[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
29946 1726882573.37479: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
29946 1726882573.37927: Added group all to inventory
29946 1726882573.37929: Added group ungrouped to inventory
29946 1726882573.37932: Group all now contains ungrouped
29946 1726882573.37935: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
29946 1726882573.56960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
29946 1726882573.57020: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
29946 1726882573.57043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
29946 1726882573.57102: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
29946 1726882573.57174: Loaded config def from plugin (inventory/script)
29946 1726882573.57176: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
29946 1726882573.57220: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
29946 1726882573.57308: Loaded config def from plugin (inventory/yaml)
29946 1726882573.57311: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
29946 1726882573.57395: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
29946 1726882573.57818: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
29946 1726882573.57822: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
29946 1726882573.57825: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
29946 1726882573.57831: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
29946 1726882573.57835: Loading data from /tmp/network-Kc3/inventory.yml
29946 1726882573.57907: /tmp/network-Kc3/inventory.yml was not parsable by auto
29946 1726882573.57972: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
29946 1726882573.58016: Loading data from /tmp/network-Kc3/inventory.yml
29946 1726882573.58097: group all already in inventory
29946 1726882573.58104: set inventory_file for managed_node1
29946 1726882573.58108: set inventory_dir for managed_node1
29946 1726882573.58109: Added host managed_node1 to inventory
29946 1726882573.58111: Added host managed_node1 to group all
29946 1726882573.58112: set ansible_host for managed_node1
29946 1726882573.58113:
set ansible_ssh_extra_args for managed_node1 29946 1726882573.58116: set inventory_file for managed_node2 29946 1726882573.58118: set inventory_dir for managed_node2 29946 1726882573.58119: Added host managed_node2 to inventory 29946 1726882573.58120: Added host managed_node2 to group all 29946 1726882573.58121: set ansible_host for managed_node2 29946 1726882573.58122: set ansible_ssh_extra_args for managed_node2 29946 1726882573.58124: set inventory_file for managed_node3 29946 1726882573.58126: set inventory_dir for managed_node3 29946 1726882573.58127: Added host managed_node3 to inventory 29946 1726882573.58128: Added host managed_node3 to group all 29946 1726882573.58129: set ansible_host for managed_node3 29946 1726882573.58130: set ansible_ssh_extra_args for managed_node3 29946 1726882573.58132: Reconcile groups and hosts in inventory. 29946 1726882573.58136: Group ungrouped now contains managed_node1 29946 1726882573.58137: Group ungrouped now contains managed_node2 29946 1726882573.58139: Group ungrouped now contains managed_node3 29946 1726882573.58212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 29946 1726882573.58340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 29946 1726882573.58389: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 29946 1726882573.58420: Loaded config def from plugin (vars/host_group_vars) 29946 1726882573.58422: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 29946 1726882573.58429: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 29946 1726882573.58437: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 29946 1726882573.58478: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 29946 1726882573.58803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882573.58896: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 29946 1726882573.58939: Loaded config def from plugin (connection/local) 29946 1726882573.58942: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 29946 1726882573.59554: Loaded config def from plugin (connection/paramiko_ssh) 29946 1726882573.59557: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 29946 1726882573.60387: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 29946 1726882573.60429: Loaded config def from plugin (connection/psrp) 29946 1726882573.60432: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 29946 1726882573.61117: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 29946 1726882573.61157: Loaded config def from plugin (connection/ssh) 29946 1726882573.61159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 29946 1726882573.62964: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 29946 1726882573.63005: Loaded config def from plugin (connection/winrm) 29946 1726882573.63008: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 29946 1726882573.63037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 29946 1726882573.63099: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 29946 1726882573.63162: Loaded config def from plugin (shell/cmd) 29946 1726882573.63164: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 29946 1726882573.63190: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 29946 1726882573.63251: Loaded config def from plugin (shell/powershell) 29946 1726882573.63253: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 29946 1726882573.63300: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 29946 1726882573.63482: Loaded config def from plugin (shell/sh) 29946 1726882573.63484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 29946 1726882573.63515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 29946 1726882573.63617: Loaded config def from plugin (become/runas) 29946 1726882573.63620: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 29946 1726882573.64032: Loaded config def from plugin (become/su) 29946 1726882573.64035: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 29946 1726882573.64195: Loaded config def from plugin (become/sudo) 29946 1726882573.64198: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 29946 1726882573.64232: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml 29946 1726882573.65020: in VariableManager get_vars() 29946 1726882573.65043: done with get_vars() 29946 1726882573.65373: trying /usr/local/lib/python3.12/site-packages/ansible/modules 29946 1726882573.69778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 29946 1726882573.69889: in VariableManager 
get_vars() 29946 1726882573.69897: done with get_vars() 29946 1726882573.69900: variable 'playbook_dir' from source: magic vars 29946 1726882573.69901: variable 'ansible_playbook_python' from source: magic vars 29946 1726882573.69901: variable 'ansible_config_file' from source: magic vars 29946 1726882573.69902: variable 'groups' from source: magic vars 29946 1726882573.69903: variable 'omit' from source: magic vars 29946 1726882573.69904: variable 'ansible_version' from source: magic vars 29946 1726882573.69904: variable 'ansible_check_mode' from source: magic vars 29946 1726882573.69905: variable 'ansible_diff_mode' from source: magic vars 29946 1726882573.69906: variable 'ansible_forks' from source: magic vars 29946 1726882573.69906: variable 'ansible_inventory_sources' from source: magic vars 29946 1726882573.69907: variable 'ansible_skip_tags' from source: magic vars 29946 1726882573.69908: variable 'ansible_limit' from source: magic vars 29946 1726882573.69909: variable 'ansible_run_tags' from source: magic vars 29946 1726882573.69909: variable 'ansible_verbosity' from source: magic vars 29946 1726882573.69945: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml 29946 1726882573.71332: in VariableManager get_vars() 29946 1726882573.71350: done with get_vars() 29946 1726882573.71389: in VariableManager get_vars() 29946 1726882573.71403: done with get_vars() 29946 1726882573.71436: in VariableManager get_vars() 29946 1726882573.71448: done with get_vars() 29946 1726882573.71747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ 29946 1726882573.71860: in VariableManager get_vars() 29946 1726882573.71874: done with get_vars() 29946 1726882573.71879: variable 'omit' from source: magic vars 29946 1726882573.72102: variable 'omit' from source: magic vars 29946 1726882573.72137: in VariableManager get_vars() 29946 1726882573.72148: done with get_vars() 29946 1726882573.72192: in VariableManager get_vars() 29946 1726882573.72206: done with get_vars() 29946 1726882573.72240: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 29946 1726882573.72584: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 29946 1726882573.72924: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 29946 1726882573.74489: in VariableManager get_vars() 29946 1726882573.74511: done with get_vars() 29946 1726882573.74973: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29946 1726882573.79806: in VariableManager get_vars() 29946 1726882573.79810: done with get_vars() 29946 1726882573.79813: variable 'playbook_dir' from source: magic vars 29946 1726882573.79814: variable 'ansible_playbook_python' from source: magic vars 29946 1726882573.79815: variable 'ansible_config_file' from source: magic vars 29946 1726882573.79815: variable 'groups' from source: magic vars 29946 1726882573.79816: variable 'omit' from source: magic vars 29946 1726882573.79817: variable 'ansible_version' from source: magic vars 29946 1726882573.79818: variable 'ansible_check_mode' from source: magic vars 29946 1726882573.79818: variable 'ansible_diff_mode' from source: magic vars 29946 1726882573.79819: variable 'ansible_forks' from source: magic vars 29946 
1726882573.79820: variable 'ansible_inventory_sources' from source: magic vars 29946 1726882573.79820: variable 'ansible_skip_tags' from source: magic vars 29946 1726882573.79821: variable 'ansible_limit' from source: magic vars 29946 1726882573.79822: variable 'ansible_run_tags' from source: magic vars 29946 1726882573.79822: variable 'ansible_verbosity' from source: magic vars 29946 1726882573.79854: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 29946 1726882573.79926: in VariableManager get_vars() 29946 1726882573.79929: done with get_vars() 29946 1726882573.79932: variable 'playbook_dir' from source: magic vars 29946 1726882573.79933: variable 'ansible_playbook_python' from source: magic vars 29946 1726882573.79933: variable 'ansible_config_file' from source: magic vars 29946 1726882573.79934: variable 'groups' from source: magic vars 29946 1726882573.79935: variable 'omit' from source: magic vars 29946 1726882573.79935: variable 'ansible_version' from source: magic vars 29946 1726882573.79936: variable 'ansible_check_mode' from source: magic vars 29946 1726882573.79937: variable 'ansible_diff_mode' from source: magic vars 29946 1726882573.79938: variable 'ansible_forks' from source: magic vars 29946 1726882573.79938: variable 'ansible_inventory_sources' from source: magic vars 29946 1726882573.79939: variable 'ansible_skip_tags' from source: magic vars 29946 1726882573.79940: variable 'ansible_limit' from source: magic vars 29946 1726882573.79941: variable 'ansible_run_tags' from source: magic vars 29946 1726882573.79941: variable 'ansible_verbosity' from source: magic vars 29946 1726882573.79971: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 29946 1726882573.80040: in VariableManager get_vars() 29946 1726882573.80052: done with get_vars() 29946 1726882573.80096: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 29946 1726882573.80214: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 29946 1726882573.80290: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 29946 1726882573.80669: in VariableManager get_vars() 29946 1726882573.80689: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29946 1726882573.83116: in VariableManager get_vars() 29946 1726882573.83130: done with get_vars() 29946 1726882573.83164: in VariableManager get_vars() 29946 1726882573.83167: done with get_vars() 29946 1726882573.83169: variable 'playbook_dir' from source: magic vars 29946 1726882573.83170: variable 'ansible_playbook_python' from source: magic vars 29946 1726882573.83170: variable 'ansible_config_file' from source: magic vars 29946 1726882573.83171: variable 'groups' from source: magic vars 29946 1726882573.83172: variable 'omit' from source: magic vars 29946 1726882573.83172: variable 'ansible_version' from source: magic vars 29946 1726882573.83173: variable 'ansible_check_mode' from source: magic vars 29946 1726882573.83174: variable 'ansible_diff_mode' from source: magic vars 29946 1726882573.83174: variable 'ansible_forks' from source: magic vars 29946 1726882573.83175: variable 'ansible_inventory_sources' from source: magic vars 29946 1726882573.83176: variable 'ansible_skip_tags' from source: 
magic vars 29946 1726882573.83177: variable 'ansible_limit' from source: magic vars 29946 1726882573.83177: variable 'ansible_run_tags' from source: magic vars 29946 1726882573.83178: variable 'ansible_verbosity' from source: magic vars 29946 1726882573.83209: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 29946 1726882573.83292: in VariableManager get_vars() 29946 1726882573.83508: done with get_vars() 29946 1726882573.83546: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 29946 1726882573.83648: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 29946 1726882573.83797: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 29946 1726882573.84192: in VariableManager get_vars() 29946 1726882573.84212: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29946 1726882573.85827: in VariableManager get_vars() 29946 1726882573.85840: done with get_vars() 29946 1726882573.85873: in VariableManager get_vars() 29946 1726882573.85884: done with get_vars() 29946 1726882573.85918: in VariableManager get_vars() 29946 1726882573.85929: done with get_vars() 29946 1726882573.85992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 29946 1726882573.86010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 29946 1726882573.86238: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 29946 1726882573.86411: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 29946 1726882573.86414: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 29946 1726882573.86445: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 29946 1726882573.86470: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 29946 1726882573.86631: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 29946 1726882573.86689: Loaded config def from plugin (callback/default) 29946 1726882573.86692: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 29946 1726882573.88685: Loaded config def from plugin (callback/junit) 29946 1726882573.88687: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 29946 1726882573.88733: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 29946 1726882573.88850: Loaded config def from plugin (callback/minimal) 29946 1726882573.88853: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 29946 1726882573.88931: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 29946 1726882573.88991: Loaded config def from plugin (callback/tree) 29946 1726882573.88995: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 29946 1726882573.89120: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 29946 1726882573.89122: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
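The inventory handling near the top of the log (managed_node1, managed_node2 and managed_node3 each get ansible_host and ansible_ssh_extra_args set, and all three end up only in the ungrouped group) is consistent with a small YAML inventory of the following shape. This is a hypothetical reconstruction of /tmp/network-Kc3/inventory.yml: only the host names and variable names come from the log; the addresses and SSH options are placeholders (the 10.31.14.69 address for managed_node2 is the one the later SSH debug output connects to).

```yaml
# Hypothetical reconstruction of /tmp/network-Kc3/inventory.yml.
# Host names and variable names are taken from the log; values are placeholders.
all:
  hosts:
    managed_node1:
      ansible_host: 203.0.113.11                              # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options
    managed_node2:
      ansible_host: 10.31.14.69                               # address seen in the SSH debug output below
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options
    managed_node3:
      ansible_host: 203.0.113.13                              # placeholder address
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"   # placeholder options
```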
PLAYBOOK: tests_routing_rules_nm.yml *******************************************
6 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml
29946 1726882573.89148: in VariableManager get_vars()
29946 1726882573.89161: done with get_vars()
29946 1726882573.89167: in VariableManager get_vars()
29946 1726882573.89175: done with get_vars()
29946 1726882573.89179: variable 'omit' from source: magic vars
29946 1726882573.89216: in VariableManager get_vars()
29946 1726882573.89229: done with get_vars()
29946 1726882573.89249: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider] ****
29946 1726882573.89766: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
29946 1726882573.89837: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
29946 1726882573.89867: getting the remaining hosts for this loop
29946 1726882573.89869: done getting the remaining hosts for this loop
29946 1726882573.89876: getting the next task for host managed_node2
29946 1726882573.89879: done getting next task for host managed_node2
29946 1726882573.89881: ^ task is: TASK: Gathering Facts
29946 1726882573.89883: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
29946 1726882573.89885: getting variables
29946 1726882573.89886: in VariableManager get_vars()
29946 1726882573.89897: Calling all_inventory to load vars for managed_node2
29946 1726882573.89899: Calling groups_inventory to load vars for managed_node2
29946 1726882573.89902: Calling all_plugins_inventory to load vars for managed_node2
29946 1726882573.89913: Calling all_plugins_play to load vars for managed_node2
29946 1726882573.89924: Calling groups_plugins_inventory to load vars for managed_node2
29946 1726882573.89927: Calling groups_plugins_play to load vars for managed_node2
29946 1726882573.89958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
29946 1726882573.90013: done with get_vars()
29946 1726882573.90019: done getting variables
29946 1726882573.90201: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
Friday 20 September 2024 21:36:13 -0400 (0:00:00.011) 0:00:00.011 ******
29946 1726882573.90221: entering _queue_task() for managed_node2/gather_facts
29946 1726882573.90223: Creating lock for gather_facts
29946 1726882573.90552: worker is 1 (out of 1 available)
29946 1726882573.90560: exiting _queue_task() for managed_node2/gather_facts
29946 1726882573.90573: done queuing things up, now waiting for results queue to drain
29946 1726882573.90574: waiting for pending results...
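The PLAY banner above ("Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider") together with the earlier load of tests/network/playbooks/tests_routing_rules.yml follows the usual pattern of the *_nm.yml wrappers in the fedora.linux_system_roles tests: a short play that pins the provider, then an import of the provider-agnostic test, which supplies the remaining plays of the 6 counted above. This is only a sketch of that pattern, not the verbatim contents of tests_routing_rules_nm.yml:

```yaml
# Sketch of the wrapper pattern; the real tests_routing_rules_nm.yml may name
# tasks differently or include additional setup tasks.
- name: Run playbook 'playbooks/tests_routing_rules.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

- name: Import the provider-agnostic test
  import_playbook: playbooks/tests_routing_rules.yml
```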
29946 1726882573.91216: running TaskExecutor() for managed_node2/TASK: Gathering Facts 29946 1726882573.91223: in run() - task 12673a56-9f93-95e7-9dfb-0000000000af 29946 1726882573.91226: variable 'ansible_search_path' from source: unknown 29946 1726882573.91228: calling self._execute() 29946 1726882573.91240: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882573.91253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882573.91267: variable 'omit' from source: magic vars 29946 1726882573.91369: variable 'omit' from source: magic vars 29946 1726882573.91403: variable 'omit' from source: magic vars 29946 1726882573.91442: variable 'omit' from source: magic vars 29946 1726882573.91495: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882573.91538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882573.91569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882573.91590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882573.91607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882573.91641: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882573.91650: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882573.91658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882573.91763: Set connection var ansible_pipelining to False 29946 1726882573.91780: Set connection var ansible_shell_executable to /bin/sh 29946 1726882573.91792: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882573.91805: Set connection var ansible_timeout to 10 29946 1726882573.91817: Set connection var ansible_shell_type to sh 29946 1726882573.91822: Set connection var ansible_connection to ssh 29946 1726882573.91845: variable 'ansible_shell_executable' from source: unknown 29946 1726882573.91887: variable 'ansible_connection' from source: unknown 29946 1726882573.91890: variable 'ansible_module_compression' from source: unknown 29946 1726882573.91894: variable 'ansible_shell_type' from source: unknown 29946 1726882573.91897: variable 'ansible_shell_executable' from source: unknown 29946 1726882573.91899: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882573.91901: variable 'ansible_pipelining' from source: unknown 29946 1726882573.91904: variable 'ansible_timeout' from source: unknown 29946 1726882573.91906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882573.92107: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882573.92129: variable 'omit' from source: magic vars 29946 1726882573.92207: starting attempt loop 29946 1726882573.92210: running the handler 29946 1726882573.92213: variable 'ansible_facts' from source: unknown 29946 1726882573.92215: _low_level_execute_command(): starting 29946 1726882573.92217: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882573.92981: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882573.93035: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882573.93062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882573.93162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882573.94875: stdout chunk (state=3): >>>/root <<< 29946 1726882573.95009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882573.95019: stdout chunk (state=3): >>><<< 29946 1726882573.95032: stderr chunk (state=3): >>><<< 29946 1726882573.95142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882573.95151: _low_level_execute_command(): starting 29946 1726882573.95155: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229 `" && echo ansible-tmp-1726882573.950596-29966-193673413028229="` echo /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229 `" ) && sleep 0' 29946 1726882573.95712: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882573.95733: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882573.95754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882573.95845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882573.97733: stdout chunk (state=3): >>>ansible-tmp-1726882573.950596-29966-193673413028229=/root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229 <<< 29946 1726882573.97866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882573.97884: stdout chunk (state=3): >>><<< 29946 1726882573.97902: stderr chunk (state=3): >>><<< 29946 1726882573.97924: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882573.950596-29966-193673413028229=/root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882573.97962: variable 'ansible_module_compression' from source: unknown 29946 1726882573.98033: ANSIBALLZ: Using generic lock for ansible.legacy.setup 29946 1726882573.98041: ANSIBALLZ: Acquiring lock 29946 1726882573.98048: ANSIBALLZ: Lock acquired: 140626579263984 29946 1726882573.98056: ANSIBALLZ: Creating module 29946 1726882574.26818: ANSIBALLZ: Writing module into payload 29946 1726882574.27068: ANSIBALLZ: Writing module 29946 1726882574.27072: ANSIBALLZ: Renaming module 29946 1726882574.27074: ANSIBALLZ: Done creating module 29946 1726882574.27077: variable 'ansible_facts' from source: unknown 29946 
1726882574.27096: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882574.27112: _low_level_execute_command(): starting 29946 1726882574.27122: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 29946 1726882574.27975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882574.27991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882574.28010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882574.28027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882574.28057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882574.28069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882574.28168: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882574.28203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882574.28304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882574.29905: stdout chunk (state=3): >>>PLATFORM <<< 29946 1726882574.29972: stdout chunk (state=3): >>>Linux <<< 29946 1726882574.30017: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 29946 1726882574.30029: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 29946 1726882574.30189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882574.30192: stdout chunk (state=3): >>><<< 29946 1726882574.30197: stderr chunk (state=3): >>><<< 29946 1726882574.30331: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882574.30336 [managed_node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 29946 1726882574.30340: _low_level_execute_command(): starting 29946 1726882574.30342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 29946 1726882574.30622: Sending initial data 29946 1726882574.30625: Sent initial data (1181 bytes) 29946 1726882574.30883: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882574.30903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882574.30917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882574.31013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882574.31051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882574.31065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882574.31091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882574.31176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882574.34560: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 29946 1726882574.34935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882574.34961: stderr chunk (state=3): >>><<< 29946 1726882574.34964: stdout chunk (state=3): >>><<< 29946 1726882574.34977: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel 
fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882574.35036: variable 'ansible_facts' from source: unknown 29946 1726882574.35039: variable 'ansible_facts' from source: unknown 29946 1726882574.35046: variable 'ansible_module_compression' from source: unknown 29946 1726882574.35078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 1726882574.35101: variable 'ansible_facts' from source: unknown 29946 1726882574.35223: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py 29946 1726882574.35323: Sending initial data 29946 1726882574.35326: Sent initial data (153 bytes) 29946 1726882574.35885: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882574.35905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882574.35964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 
1726882574.36031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882574.37544: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 29946 1726882574.37548: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882574.37606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882574.37666: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp_ucw4br6 /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py <<< 29946 1726882574.37672: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py" <<< 29946 1726882574.37726: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp_ucw4br6" to remote "/root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py" <<< 29946 1726882574.39179: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882574.39182: stdout chunk (state=3): >>><<< 29946 1726882574.39185: stderr chunk (state=3): >>><<< 29946 1726882574.39187: done transferring module to remote 29946 1726882574.39189: _low_level_execute_command(): starting 29946 1726882574.39192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/ /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py && sleep 0' 29946 1726882574.39642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882574.39679: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 29946 1726882574.39691: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882574.39757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882574.41551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882574.41555: stdout chunk (state=3): >>><<< 29946 1726882574.41560: stderr chunk (state=3): >>><<< 29946 1726882574.41582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882574.41585: _low_level_execute_command(): starting 29946 1726882574.41591: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/AnsiballZ_setup.py && sleep 0' 29946 1726882574.42211: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882574.42272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882574.42275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882574.42353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882574.44433: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 29946 1726882574.44462: stdout chunk (state=3): >>>import _imp # builtin <<< 29946 1726882574.44498: 
stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 29946 1726882574.44501: stdout chunk (state=3): >>>import '_weakref' # <<< 29946 1726882574.44597: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 29946 1726882574.44685: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 29946 1726882574.44691: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 29946 1726882574.44721: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.44744: stdout chunk (state=3): >>>import '_codecs' # <<< 29946 1726882574.44769: stdout chunk (state=3): >>>import 'codecs' # <<< 29946 1726882574.44816: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 29946 1726882574.44868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bd684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bd37b30> <<< 29946 1726882574.44872: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 29946 1726882574.44900: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bd6aa50> import '_signal' # <<< 29946 1726882574.44938: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 29946 1726882574.44952: stdout chunk (state=3): >>> import 'io' # <<< 29946 1726882574.44985: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 29946 1726882574.45066: stdout chunk (state=3): >>>import '_collections_abc' # <<< 29946 1726882574.45107: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 29946 1726882574.45146: stdout chunk (state=3): >>>import 'os' # <<< 29946 1726882574.45151: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages <<< 29946 1726882574.45185: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 29946 1726882574.45206: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 29946 1726882574.45237: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 29946 1726882574.45250: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb1d130> <<< 29946 1726882574.45318: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import 
'_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb1dfa0> <<< 29946 1726882574.45352: stdout chunk (state=3): >>>import 'site' # <<< 29946 1726882574.45370: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 29946 1726882574.45714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 29946 1726882574.45723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 29946 1726882574.45747: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 29946 1726882574.45752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.45778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 29946 1726882574.45815: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 29946 1726882574.45835: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 29946 1726882574.45854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 29946 1726882574.45880: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb5bdd0> <<< 29946 1726882574.45894: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 29946 1726882574.45910: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 29946 1726882574.45930: stdout chunk (state=3): >>>import '_operator' # <<< 29946 1726882574.45935: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb5bfe0> <<< 29946 1726882574.45953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 29946 1726882574.45979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 29946 1726882574.46002: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 29946 1726882574.46053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.46065: stdout chunk (state=3): >>>import 'itertools' # <<< 29946 1726882574.46103: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 29946 1726882574.46115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb937a0> <<< 29946 1726882574.46130: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb93e30> <<< 29946 1726882574.46153: stdout chunk (state=3): >>>import '_collections' # <<< 29946 1726882574.46205: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb73aa0> <<< 29946 1726882574.46215: stdout chunk (state=3): >>>import '_functools' # <<< 29946 1726882574.46235: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb711c0> <<< 29946 1726882574.46326: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb58f80> <<< 29946 1726882574.46350: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 29946 1726882574.46372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 29946 1726882574.46376: stdout chunk (state=3): >>>import '_sre' # <<< 29946 1726882574.46406: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 29946 1726882574.46426: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 29946 1726882574.46454: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 29946 1726882574.46485: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbb3710> <<< 29946 1726882574.46499: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbb2330> <<< 29946 1726882574.46533: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb72090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbb0b90> <<< 29946 1726882574.46585: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 29946 1726882574.46590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe8740> <<< 29946 1726882574.46605: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb58200> <<< 29946 1726882574.46622: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 29946 1726882574.46653: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.46658: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bbe8bf0> import 'struct' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe8aa0> <<< 29946 1726882574.46694: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.46705: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.46711: stdout chunk (state=3): >>>import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bbe8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb56d20> <<< 29946 1726882574.46741: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 29946 1726882574.46751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.46764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 29946 1726882574.46797: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 29946 1726882574.46815: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe9250> <<< 29946 1726882574.46819: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 29946 1726882574.46850: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 29946 1726882574.46874: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbea480> <<< 29946 1726882574.46885: stdout chunk (state=3): >>>import 'importlib.util' # <<< 29946 1726882574.46898: stdout chunk (state=3): >>>import 'runpy' # <<< 29946 1726882574.46916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 29946 1726882574.46953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 29946 1726882574.46977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 29946 1726882574.46980: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc00680> <<< 29946 1726882574.47002: stdout chunk (state=3): >>>import 'errno' # <<< 29946 1726882574.47027: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.47036: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bc01d60> <<< 29946 1726882574.47059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 29946 1726882574.47067: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 29946 1726882574.47098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 29946 1726882574.47104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 29946 1726882574.47118: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc02c00> <<< 29946 1726882574.47151: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.47154: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bc03260> <<< 29946 1726882574.47158: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc02150> <<< 29946 1726882574.47190: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 29946 1726882574.47195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 29946 1726882574.47234: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bc03ce0> <<< 29946 1726882574.47248: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc03410> <<< 29946 1726882574.47288: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbea4b0> <<< 29946 1726882574.47315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 29946 1726882574.47330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 29946 1726882574.47360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 29946 1726882574.47369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 29946 1726882574.47413: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b8ffbc0> <<< 29946 1726882574.47428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 29946 1726882574.47455: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b9286e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b928440> <<< 29946 1726882574.47482: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b928710> <<< 29946 1726882574.47520: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 29946 1726882574.47523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 29946 1726882574.47589: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.47714: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b929040> <<< 29946 1726882574.47825: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.47828: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b929a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9288f0> <<< 29946 1726882574.47858: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b8fdd60> <<< 29946 1726882574.47874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 29946 1726882574.47900: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 29946 1726882574.47914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 29946 1726882574.47931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 29946 1726882574.47940: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b92ade0> <<< 29946 1726882574.47965: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b929b50> <<< 29946 1726882574.47977: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbeaba0> <<< 29946 1726882574.48008: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 29946 1726882574.48060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.48082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 29946 1726882574.48113: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 29946 1726882574.48144: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b957140> <<< 29946 1726882574.48191: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 29946 1726882574.48211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.48226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 29946 1726882574.48254: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 29946 1726882574.48287: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b977500> <<< 29946 1726882574.48312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 29946 1726882574.48354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 29946 1726882574.48409: stdout chunk (state=3): >>>import 'ntpath' # <<< 29946 1726882574.48429: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.48433: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9d82c0> <<< 29946 1726882574.48449: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 29946 1726882574.48479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 29946 1726882574.48503: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 29946 1726882574.48544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 29946 1726882574.48626: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9daa20> <<< 29946 1726882574.48702: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9d83e0> <<< 29946 1726882574.48730: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9a12e0> <<< 29946 1726882574.48759: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3253d0> <<< 29946 1726882574.48781: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b976300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b92bd10> <<< 29946 1726882574.48965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 29946 
1726882574.48977: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7ff55b976900> <<< 29946 1726882574.49241: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_jlsou5lm/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 29946 1726882574.49365: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.49387: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 29946 1726882574.49407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 29946 1726882574.49439: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 29946 1726882574.49513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 29946 1726882574.49536: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 29946 1726882574.49544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b38b0b0> <<< 29946 1726882574.49557: stdout chunk (state=3): >>>import '_typing' # <<< 29946 1726882574.49743: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b369fa0> <<< 29946 1726882574.49753: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b369130> <<< 29946 1726882574.49764: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.49778: stdout chunk (state=3): >>>import 'ansible' # <<< 29946 1726882574.49790: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.49806: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.49825: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.49828: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 29946 1726882574.49847: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.51237: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.52376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 29946 1726882574.52386: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b388f80> <<< 29946 1726882574.52401: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 29946 1726882574.52407: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.52430: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 29946 1726882574.52435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 29946 1726882574.52461: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 29946 1726882574.52490: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.52495: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3ba960> <<< 29946 1726882574.52523: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3ba6f0> <<< 29946 1726882574.52560: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3ba000> <<< 29946 1726882574.52578: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 29946 1726882574.52581: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 29946 1726882574.52628: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3baa50> <<< 29946 1726882574.52633: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b38bd40> import 'atexit' # <<< 29946 1726882574.52665: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.52675: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3bb6b0> <<< 29946 1726882574.52698: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3bb8f0><<< 29946 1726882574.52712: stdout chunk (state=3): >>> <<< 29946 1726882574.52718: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 29946 1726882574.52764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 29946 1726882574.52771: stdout chunk (state=3): >>>import '_locale' # <<< 29946 1726882574.52823: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3bbe30> <<< 29946 1726882574.52831: stdout chunk (state=3): >>>import 'pwd' # <<< 29946 1726882574.52844: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 29946 1726882574.52867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 29946 1726882574.52906: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b225be0> <<< 29946 1726882574.52934: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.52943: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b227800> <<< 29946 1726882574.52962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 29946 1726882574.52965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 29946 1726882574.53010: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2281d0> <<< 29946 1726882574.53024: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 29946 1726882574.53055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 29946 1726882574.53067: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b229370> <<< 29946 1726882574.53088: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 29946 1726882574.53119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 29946 1726882574.53146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 29946 1726882574.53151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 29946 1726882574.53201: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b22be30> <<< 29946 1726882574.53233: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.53238: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3bbef0> <<< 29946 1726882574.53255: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b22a0f0> <<< 29946 1726882574.53275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 29946 1726882574.53301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 29946 1726882574.53327: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 29946 1726882574.53331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 29946 1726882574.53348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 29946 1726882574.53437: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 29946 1726882574.53465: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 29946 1726882574.53475: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b233ce0> <<< 29946 1726882574.53485: stdout chunk (state=3): >>>import '_tokenize' # <<< 29946 1726882574.53552: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2327b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b232510> <<< 29946 1726882574.53574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 29946 1726882574.53579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 29946 1726882574.53659: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b232a80> <<< 29946 1726882574.53680: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b22a600> <<< 29946 1726882574.53712: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b277fe0> <<< 29946 1726882574.53740: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.53744: stdout chunk (state=3): >>>import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b278110> <<< 29946 1726882574.53767: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 29946 1726882574.53775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 29946 1726882574.53803: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 29946 1726882574.53841: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.53849: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b279bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b279970> <<< 29946 1726882574.53861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 29946 1726882574.53896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 29946 1726882574.53939: stdout chunk (state=3): >>># extension module '_uuid' loaded from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.53942: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b27c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b27a2a0> <<< 29946 1726882574.53969: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 29946 1726882574.54002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.54027: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 29946 1726882574.54037: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 29946 1726882574.54047: stdout chunk (state=3): >>>import '_string' # <<< 29946 1726882574.54091: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b27f860><<< 29946 1726882574.54098: stdout chunk (state=3): >>> <<< 29946 1726882574.54213: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b27c230> <<< 29946 1726882574.54272: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b2806b0> <<< 29946 1726882574.54302: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b280890> <<< 29946 1726882574.54351: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.54355: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b280a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b278320> <<< 29946 1726882574.54381: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 29946 1726882574.54406: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 29946 1726882574.54425: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 29946 1726882574.54454: stdout chunk (state=3): >>># extension module '_socket' loaded 
from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.54476: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.54479: stdout chunk (state=3): >>>import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b10c0b0> <<< 29946 1726882574.54627: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.54630: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b10d370> <<< 29946 1726882574.54636: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b282870> <<< 29946 1726882574.54671: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.54678: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b283c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b282510> <<< 29946 1726882574.54699: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.54702: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 29946 1726882574.54725: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.54808: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.54897: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.54908: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.54918: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 29946 1726882574.54939: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 29946 1726882574.54958: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.55076: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.55195: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.55709: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.56234: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 29946 1726882574.56249: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 29946 1726882574.56255: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 29946 1726882574.56276: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 29946 1726882574.56284: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.56339: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed 
from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b111550> <<< 29946 1726882574.56419: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 29946 1726882574.56430: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 29946 1726882574.56441: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b112300> <<< 29946 1726882574.56448: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9299d0> <<< 29946 1726882574.56482: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 29946 1726882574.56501: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.56525: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.56533: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 29946 1726882574.56546: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.56694: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.56846: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 29946 1726882574.56857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1122d0> <<< 29946 1726882574.56866: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.57310: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.57756: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.57820: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.57895: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 29946 1726882574.57901: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.57944: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.57970: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 29946 1726882574.57986: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58056: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58135: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 29946 1726882574.58143: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58160: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 29946 1726882574.58179: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58217: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58259: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 29946 1726882574.58266: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58491: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 29946 1726882574.58773: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 29946 1726882574.58778: stdout chunk (state=3): >>>import '_ast' # <<< 29946 1726882574.58843: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1134a0> <<< 29946 1726882574.58855: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.58927: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59003: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 29946 1726882574.59014: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 29946 1726882574.59023: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 29946 1726882574.59033: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59080: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59114: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 29946 1726882574.59128: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59168: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59214: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59271: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59335: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 29946 1726882574.59364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.59446: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b11df10> <<< 29946 1726882574.59477: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b119700> <<< 29946 1726882574.59512: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 29946 1726882574.59516: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 29946 1726882574.59589: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59646: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59678: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.59721: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 29946 1726882574.59728: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.59739: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 29946 1726882574.59763: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 29946 1726882574.59780: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 29946 1726882574.59836: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 29946 1726882574.59853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 29946 1726882574.59870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 29946 1726882574.59924: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2068d0> <<< 29946 1726882574.59968: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2fe5a0> <<< 29946 1726882574.60044: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b11e000> <<< 29946 1726882574.60048: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b280d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 29946 1726882574.60082: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60112: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 29946 1726882574.60172: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 29946 1726882574.60185: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60201: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60204: stdout chunk (state=3): >>>import 'ansible.modules' # <<< 29946 1726882574.60208: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60270: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60330: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60357: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60366: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60412: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60455: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60492: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60522: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 29946 1726882574.60539: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60607: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60676: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60695: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60730: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 29946 1726882574.60736: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.60913: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.61081: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.61124: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.61186: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 29946 
1726882574.61197: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.61205: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 29946 1726882574.61222: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 29946 1726882574.61234: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 29946 1726882574.61258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 29946 1726882574.61275: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b1e80> <<< 29946 1726882574.61302: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 29946 1726882574.61311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 29946 1726882574.61333: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 29946 1726882574.61368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 29946 1726882574.61398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 29946 1726882574.61408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 29946 1726882574.61418: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad27e00> <<< 29946 1726882574.61442: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.61461: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad2c230> <<< 29946 1726882574.61512: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b19a810> <<< 29946 1726882574.61523: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b29f0> <<< 29946 1726882574.61556: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b0530> <<< 29946 1726882574.61561: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b1010> <<< 29946 1726882574.61581: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 29946 1726882574.61619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 29946 1726882574.61647: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 29946 1726882574.61656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 29946 1726882574.61679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 29946 1726882574.61716: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad2f140> <<< 29946 1726882574.61724: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2e9f0> <<< 29946 1726882574.61744: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.61747: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad2ebd0> <<< 29946 1726882574.61766: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2de50> <<< 29946 1726882574.61783: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 29946 1726882574.61878: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 29946 1726882574.61896: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2f320> <<< 29946 1726882574.61907: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 29946 1726882574.61939: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 29946 1726882574.61969: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882574.61976: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad91e50> <<< 29946 1726882574.61999: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2fe30> <<< 29946 1726882574.62034: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b0950> import 'ansible.module_utils.facts.timeout' # <<< 29946 1726882574.62042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 29946 1726882574.62057: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62073: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62086: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.other' # <<< 29946 1726882574.62098: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62157: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62214: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 29946 1726882574.62224: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62278: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 29946 1726882574.62344: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62347: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62364: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 29946 1726882574.62405: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62434: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 29946 1726882574.62442: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62498: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 29946 1726882574.62609: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 29946 1726882574.62710: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62769: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62827: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.62894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 29946 1726882574.62919: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.63368: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.63794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 29946 1726882574.63801: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.63856: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.63908: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.63943: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.63978: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 29946 1726882574.63982: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 29946 1726882574.63996: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64022: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64044: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 29946 1726882574.64063: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64115: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64173: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 29946 1726882574.64184: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64218: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 
1726882574.64243: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 29946 1726882574.64257: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64286: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64319: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 29946 1726882574.64324: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64408: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 29946 1726882574.64503: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 29946 1726882574.64517: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad934a0> <<< 29946 1726882574.64537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 29946 1726882574.64561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 29946 1726882574.64678: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad92840> <<< 29946 1726882574.64688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 29946 1726882574.64699: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64761: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64825: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 29946 1726882574.64833: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.64925: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.65012: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 29946 1726882574.65035: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.65091: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.65175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 29946 1726882574.65216: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.65416: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 29946 1726882574.65515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55adcdf10> <<< 29946 1726882574.65636: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2fa70> <<< 29946 1726882574.65648: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 29946 1726882574.65675: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.65734: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 29946 1726882574.65747: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 29946 1726882574.65822: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.65908: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66015: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66171: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 29946 1726882574.66174: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66208: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 29946 1726882574.66261: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66306: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 29946 1726882574.66363: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 29946 1726882574.66404: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ade1a00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55adcd4f0> <<< 29946 1726882574.66437: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 29946 1726882574.66448: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 29946 1726882574.66492: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 29946 1726882574.66553: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66692: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.66841: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 29946 1726882574.66948: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67044: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67096: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 29946 1726882574.67141: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 29946 1726882574.67165: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29946 1726882574.67191: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67329: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 29946 1726882574.67485: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67602: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67734: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 29946 1726882574.67738: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67767: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.67803: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.68344: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.68834: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 29946 1726882574.68861: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.68954: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 29946 1726882574.69062: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69161: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 29946 1726882574.69268: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69417: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 29946 1726882574.69595: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69598: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 29946 1726882574.69642: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69679: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 29946 1726882574.69720: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69818: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.69905: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70117: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70310: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 29946 1726882574.70333: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70363: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 29946 1726882574.70417: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70454: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70471: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 29946 1726882574.70480: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70536: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 29946 1726882574.70677: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29946 1726882574.70700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 29946 1726882574.70745: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70799: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 29946 1726882574.70848: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.70912: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 29946 1726882574.70922: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71169: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 29946 1726882574.71433: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71497: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 29946 1726882574.71566: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71599: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71636: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 29946 1726882574.71641: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71682: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71705: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 29946 1726882574.71725: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71752: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71788: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 29946 1726882574.71796: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71879: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71949: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 29946 1726882574.71972: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71983: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.71996: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 29946 1726882574.72005: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72047: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 29946 1726882574.72104: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72121: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72142: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72185: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72236: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72302: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 29946 1726882574.72398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 29946 1726882574.72419: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72465: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72521: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 29946 
1726882574.72707: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72899: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 29946 1726882574.72909: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.72951: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.73051: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 29946 1726882574.73061: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.73122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 29946 1726882574.73126: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.73183: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.73269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 29946 1726882574.73273: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 29946 1726882574.73288: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.73369: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.73454: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 29946 1726882574.73460: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 29946 1726882574.73533: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882574.74023: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 29946 1726882574.74032: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 29946 1726882574.74045: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 29946 1726882574.74074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 29946 1726882574.74097: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55abde570> <<< 29946 1726882574.74112: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55abdd1c0> <<< 29946 1726882574.74158: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ade17c0> <<< 29946 1726882574.84374: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 29946 1726882574.84391: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac24c20> <<< 29946 1726882574.84421: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 29946 1726882574.84438: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 29946 1726882574.84462: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac248c0> <<< 29946 1726882574.84523: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 29946 1726882574.84530: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882574.84558: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py <<< 29946 1726882574.84563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac26090> <<< 29946 1726882574.84601: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac25b20> <<< 29946 1726882574.84831: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 29946 1726882575.09717: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparm<<< 29946 1726882575.09723: stdout chunk (state=3): >>>or": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 764, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789810688, "block_size": 4096, "block_total": 65519099, "block_available": 63913528, "block_used": 1605571, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": 
"4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_selinux_pyt<<< 29946 1726882575.09762: stdout chunk (state=3): >>>hon_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "15", "epoch": "1726882575", "epoch_int": "1726882575", "date": "2024-09-20", "time": "21:36:15", "iso8601_micro": "2024-09-21T01:36:15.048277Z", "iso8601": "2024-09-21T01:36:15Z", "iso8601_basic": "20240920T213615048277", "iso8601_basic_short": "20240920T213615", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.60107421875, "5m": 0.51416015625, "15m": 0.2880859375}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": 
"on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": <<< 29946 1726882575.09774: stdout chunk (state=3): >>>"off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": 
"off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "la<<< 29946 1726882575.09790: stdout chunk (state=3): >>>rge_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882575.10349: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 29946 1726882575.10372: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 29946 1726882575.10390: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time <<< 29946 1726882575.10406: stdout chunk (state=3): >>># cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site <<< 29946 1726882575.10415: stdout chunk (state=3): >>># destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 <<< 29946 1726882575.10452: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] 
removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib <<< 29946 1726882575.10459: stdout chunk (state=3): >>># cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc <<< 29946 1726882575.10475: stdout chunk (state=3): >>># cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit <<< 29946 1726882575.10486: stdout chunk (state=3): >>># cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 29946 1726882575.10516: stdout chunk (state=3): >>># cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal <<< 29946 1726882575.10520: stdout chunk (state=3): >>># cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text<<< 29946 1726882575.10521: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.six # 
destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 29946 1726882575.10549: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 29946 1726882575.10553: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic <<< 29946 1726882575.10579: stdout chunk (state=3): >>># destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process <<< 29946 1726882575.10583: stdout chunk (state=3): >>># cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing 
multiprocessing.pool <<< 29946 1726882575.10608: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version <<< 29946 1726882575.10629: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network <<< 29946 1726882575.10636: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd <<< 29946 1726882575.10663: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix <<< 29946 1726882575.10684: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy 
ansible.module_utils.facts.network.generic_bsd <<< 29946 1726882575.10690: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat <<< 29946 1726882575.10698: stdout chunk (state=3): >>># cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 29946 1726882575.11019: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 29946 1726882575.11030: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 29946 1726882575.11068: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 29946 1726882575.11072: stdout chunk (state=3): >>># destroy _blake2 <<< 29946 1726882575.11078: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 29946 1726882575.11102: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 29946 1726882575.11131: stdout chunk (state=3): >>># destroy ntpath <<< 29946 1726882575.11167: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 29946 1726882575.11173: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 29946 1726882575.11216: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale <<< 29946 1726882575.11219: stdout chunk (state=3): >>># destroy locale # destroy select <<< 29946 1726882575.11222: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 29946 1726882575.11224: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 29946 1726882575.11269: stdout chunk (state=3): >>># destroy selinux <<< 29946 1726882575.11281: stdout chunk (state=3): >>># destroy shutil <<< 29946 1726882575.11288: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 29946 1726882575.11328: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector<<< 29946 1726882575.11331: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal <<< 29946 1726882575.11377: stdout chunk (state=3): >>># destroy pickle # destroy _compat_pickle # destroy _pickle <<< 29946 1726882575.11384: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue <<< 29946 1726882575.11386: stdout chunk (state=3): >>># destroy multiprocessing.reduction <<< 29946 1726882575.11388: stdout chunk (state=3): >>># destroy selectors <<< 29946 1726882575.11415: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 29946 1726882575.11425: stdout chunk (state=3): >>># destroy base64 <<< 29946 1726882575.11434: stdout chunk (state=3): >>># destroy _ssl <<< 29946 1726882575.11470: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 29946 1726882575.11476: stdout chunk (state=3): >>># destroy json <<< 29946 1726882575.11499: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 29946 1726882575.11523: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection <<< 29946 1726882575.11530: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 29946 1726882575.11582: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 29946 1726882575.11590: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 29946 1726882575.11615: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader <<< 29946 1726882575.11622: stdout chunk (state=3): >>># cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 29946 1726882575.11649: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 29946 1726882575.11653: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 29946 1726882575.11657: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 29946 1726882575.11682: stdout chunk 
(state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 29946 1726882575.11701: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 29946 1726882575.11733: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath <<< 29946 1726882575.11737: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases <<< 29946 1726882575.11741: stdout chunk (state=3): >>># cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal <<< 29946 1726882575.11754: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 29946 1726882575.11760: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 29946 1726882575.11903: stdout chunk (state=3): >>># destroy sys.monitoring <<< 29946 1726882575.11912: stdout chunk (state=3): >>># destroy _socket <<< 29946 1726882575.11915: stdout chunk (state=3): >>># destroy _collections <<< 29946 1726882575.11945: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 29946 1726882575.11957: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 29946 1726882575.11978: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib <<< 29946 1726882575.11983: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 29946 1726882575.12014: stdout chunk (state=3): >>># destroy _typing <<< 29946 1726882575.12029: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools <<< 29946 1726882575.12037: stdout chunk (state=3): >>># destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 29946 1726882575.12062: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 29946 1726882575.12160: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 29946 1726882575.12165: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit <<< 29946 1726882575.12197: stdout chunk (state=3): >>># destroy _warnings # destroy math # destroy _bisect # destroy time 
<<< 29946 1726882575.12204: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 29946 1726882575.12225: stdout chunk (state=3): >>># destroy _hashlib <<< 29946 1726882575.12244: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 29946 1726882575.12249: stdout chunk (state=3): >>># destroy itertools <<< 29946 1726882575.12274: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 29946 1726882575.12282: stdout chunk (state=3): >>># clear sys.audit hooks <<< 29946 1726882575.12587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882575.12622: stderr chunk (state=3): >>><<< 29946 1726882575.12625: stdout chunk (state=3): >>><<< 29946 1726882575.12739: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bd684d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bd37b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bd6aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb1d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb1dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb5bdd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb5bfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb937a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb93e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb73aa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb711c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb58f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbb3710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbb2330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb72090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbb0b90> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe8740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb58200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bbe8bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe8aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bbe8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bb56d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbe9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbea480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc00680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bc01d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff55bc02c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bc03260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc02150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55bc03ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bc03410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbea4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b8ffbc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b9286e0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b928440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b928710> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b929040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b929a30> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9288f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b8fdd60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b92ade0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b929b50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55bbeaba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b957140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b977500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9d82c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9daa20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9d83e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9a12e0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3253d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b976300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b92bd10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7ff55b976900> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_jlsou5lm/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b38b0b0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b369fa0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b369130> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b388f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3ba960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3ba6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3ba000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3baa50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b38bd40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3bb6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3bb8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b3bbe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b225be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b227800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2281d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b229370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b22be30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b3bbef0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b22a0f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b233ce0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2327b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7ff55b232510> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b232a80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b22a600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b277fe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b278110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b279bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b279970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b27c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b27a2a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b27f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b27c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b2806b0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b280890> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b280a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b278320> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b10c0b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b10d370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b282870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b283c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b282510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b111550> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b112300> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b9299d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1122d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1134a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55b11df10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b119700> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2068d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b2fe5a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b11e000> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b280d70> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b1e80> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad27e00> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad2c230> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b19a810> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b29f0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b0530> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b1010> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad2f140> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2e9f0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad2ebd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2de50> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2f320> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ad91e50> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2fe30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55b1b0950> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad934a0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad92840> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55adcdf10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ad2fa70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55ade1a00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55adcd4f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7ff55abde570> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55abdd1c0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ade17c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac24c20> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac248c0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac26090> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7ff55ac25b20> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_fips": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2958, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 573, "free": 2958}, "nocache": {"free": 3298, "used": 233}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 764, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789810688, "block_size": 4096, "block_total": 65519099, "block_available": 63913528, "block_used": 1605571, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "15", "epoch": "1726882575", 
"epoch_int": "1726882575", "date": "2024-09-20", "time": "21:36:15", "iso8601_micro": "2024-09-21T01:36:15.048277Z", "iso8601": "2024-09-21T01:36:15Z", "iso8601_basic": "20240920T213615048277", "iso8601_basic_short": "20240920T213615", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.60107421875, "5m": 0.51416015625, "15m": 0.2880859375}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": 
"off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": 
-1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] 
removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] 
removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # 
destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] 
removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # 
destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
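
Note on the "# import ..." / "# cleanup ..." chatter above and the "junk after the JSON data" warning below: the gathered facts show PYTHONVERBOSE=1 in the remote login environment (see ansible_env), which makes the remote interpreter print its verbose import/shutdown trace around the module's JSON result. A minimal sketch of one way to suppress it for a run, assuming the play-level environment keyword also covers the implicit fact-gathering task (the play and debug task below are hypothetical, not part of this run):

    - hosts: managed_node2
      gather_facts: true
      environment:
        PYTHONVERBOSE: ""          # empty value disables Python's -v chatter during module runs
      tasks:
        - name: Show a gathered fact once the module output is clean
          ansible.builtin.debug:
            var: ansible_default_ipv4.address

Alternatively, removing the variable from the remote shell profile would keep module stdout clean for every play without touching the playbook.
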
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] 
removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] 
removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy 
zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping 
_abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
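The interpreter-discovery warning above is informational: ansible-core located /usr/bin/python3.12 on managed_node2 by discovery, and that path could later resolve to a different interpreter. If the warning is unwanted in future runs, the interpreter can be pinned per host with the standard ansible_python_interpreter variable. A minimal sketch of a YAML inventory entry follows; the actual inventory contents for this run are not shown in the log, so the layout and the host-to-address mapping below are illustrative assumptions only, using values that appear in the surrounding output:

all:
  hosts:
    managed_node2:
      ansible_host: 10.31.14.69                        # address seen in the SSH debug sessions above (assumed mapping)
      ansible_python_interpreter: /usr/bin/python3.12  # pin the interpreter; discovery (and this warning) is skipped when set

With ansible_python_interpreter set explicitly, interpreter discovery is not attempted for the host, so the warning does not appear.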
29946 1726882575.13572: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882575.13575: _low_level_execute_command(): starting 29946 1726882575.13577: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882573.950596-29966-193673413028229/ > /dev/null 2>&1 && sleep 0' 29946 1726882575.13742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.13745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882575.13748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882575.13750: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.13752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.13805: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882575.13809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.13878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.15654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.15679: stderr chunk (state=3): >>><<< 29946 1726882575.15683: stdout chunk (state=3): >>><<< 29946 1726882575.15698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.15704: handler run complete 29946 1726882575.15778: variable 'ansible_facts' from source: unknown 29946 1726882575.15852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.16055: variable 'ansible_facts' from source: unknown 29946 1726882575.16114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.16192: attempt loop complete, returning result 29946 1726882575.16197: _execute() done 29946 1726882575.16200: dumping result to json 29946 1726882575.16221: done dumping result, returning 29946 1726882575.16234: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-95e7-9dfb-0000000000af] 29946 1726882575.16236: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000af 29946 1726882575.16489: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000af 29946 1726882575.16492: WORKER PROCESS EXITING ok: [managed_node2] 29946 1726882575.16757: no more pending results, returning what we have 29946 1726882575.16759: results queue empty 29946 1726882575.16760: checking for any_errors_fatal 29946 1726882575.16760: done checking for any_errors_fatal 29946 1726882575.16761: checking for max_fail_percentage 29946 1726882575.16762: done checking for max_fail_percentage 29946 1726882575.16763: checking to see if all hosts have failed and the running result is not ok 29946 1726882575.16763: done checking to see if all hosts have failed 29946 1726882575.16764: getting the remaining hosts for this loop 29946 1726882575.16765: done getting the remaining hosts for this loop 29946 1726882575.16767: getting the next task for host managed_node2 29946 1726882575.16771: done getting next task for host managed_node2 29946 1726882575.16772: ^ task is: TASK: meta (flush_handlers) 29946 1726882575.16774: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882575.16776: getting variables 29946 1726882575.16777: in VariableManager get_vars() 29946 1726882575.16796: Calling all_inventory to load vars for managed_node2 29946 1726882575.16798: Calling groups_inventory to load vars for managed_node2 29946 1726882575.16800: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882575.16807: Calling all_plugins_play to load vars for managed_node2 29946 1726882575.16809: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882575.16816: Calling groups_plugins_play to load vars for managed_node2 29946 1726882575.16933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.17058: done with get_vars() 29946 1726882575.17066: done getting variables 29946 1726882575.17114: in VariableManager get_vars() 29946 1726882575.17120: Calling all_inventory to load vars for managed_node2 29946 1726882575.17121: Calling groups_inventory to load vars for managed_node2 29946 1726882575.17123: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882575.17125: Calling all_plugins_play to load vars for managed_node2 29946 1726882575.17127: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882575.17128: Calling groups_plugins_play to load vars for managed_node2 29946 1726882575.17217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.17329: done with get_vars() 29946 1726882575.17338: done queuing things up, now waiting for results queue to drain 29946 1726882575.17340: results queue empty 29946 1726882575.17340: checking for any_errors_fatal 29946 1726882575.17342: done checking for any_errors_fatal 29946 1726882575.17342: checking for max_fail_percentage 29946 1726882575.17343: done checking for max_fail_percentage 29946 1726882575.17346: checking to see if all hosts have failed and the running result is not ok 29946 1726882575.17347: done checking to see if all hosts have failed 29946 1726882575.17347: getting the remaining hosts for this loop 29946 1726882575.17348: done getting the remaining hosts for this loop 29946 1726882575.17349: getting the next task for host managed_node2 29946 1726882575.17352: done getting next task for host managed_node2 29946 1726882575.17354: ^ task is: TASK: Include the task 'el_repo_setup.yml' 29946 1726882575.17355: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882575.17356: getting variables 29946 1726882575.17357: in VariableManager get_vars() 29946 1726882575.17364: Calling all_inventory to load vars for managed_node2 29946 1726882575.17365: Calling groups_inventory to load vars for managed_node2 29946 1726882575.17367: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882575.17370: Calling all_plugins_play to load vars for managed_node2 29946 1726882575.17371: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882575.17372: Calling groups_plugins_play to load vars for managed_node2 29946 1726882575.17455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.17583: done with get_vars() 29946 1726882575.17589: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:11 Friday 20 September 2024 21:36:15 -0400 (0:00:01.274) 0:00:01.285 ****** 29946 1726882575.17639: entering _queue_task() for managed_node2/include_tasks 29946 1726882575.17641: Creating lock for include_tasks 29946 1726882575.17841: worker is 1 (out of 1 available) 29946 1726882575.17854: exiting _queue_task() for managed_node2/include_tasks 29946 1726882575.17864: done queuing things up, now waiting for results queue to drain 29946 1726882575.17866: waiting for pending results... 29946 1726882575.18001: running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' 29946 1726882575.18060: in run() - task 12673a56-9f93-95e7-9dfb-000000000006 29946 1726882575.18071: variable 'ansible_search_path' from source: unknown 29946 1726882575.18103: calling self._execute() 29946 1726882575.18155: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.18161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.18170: variable 'omit' from source: magic vars 29946 1726882575.18245: _execute() done 29946 1726882575.18249: dumping result to json 29946 1726882575.18252: done dumping result, returning 29946 1726882575.18254: done running TaskExecutor() for managed_node2/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-95e7-9dfb-000000000006] 29946 1726882575.18259: sending task result for task 12673a56-9f93-95e7-9dfb-000000000006 29946 1726882575.18353: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000006 29946 1726882575.18356: WORKER PROCESS EXITING 29946 1726882575.18395: no more pending results, returning what we have 29946 1726882575.18400: in VariableManager get_vars() 29946 1726882575.18432: Calling all_inventory to load vars for managed_node2 29946 1726882575.18435: Calling groups_inventory to load vars for managed_node2 29946 1726882575.18437: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882575.18446: Calling all_plugins_play to load vars for managed_node2 29946 1726882575.18448: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882575.18451: Calling groups_plugins_play to load vars for managed_node2 29946 1726882575.18566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.18679: done with get_vars() 29946 1726882575.18684: variable 'ansible_search_path' from source: unknown 29946 1726882575.18696: we have included files to process 29946 1726882575.18697: 
generating all_blocks data 29946 1726882575.18698: done generating all_blocks data 29946 1726882575.18698: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 29946 1726882575.18699: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 29946 1726882575.18701: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 29946 1726882575.19125: in VariableManager get_vars() 29946 1726882575.19134: done with get_vars() 29946 1726882575.19142: done processing included file 29946 1726882575.19143: iterating over new_blocks loaded from include file 29946 1726882575.19144: in VariableManager get_vars() 29946 1726882575.19149: done with get_vars() 29946 1726882575.19150: filtering new block on tags 29946 1726882575.19159: done filtering new block on tags 29946 1726882575.19160: in VariableManager get_vars() 29946 1726882575.19165: done with get_vars() 29946 1726882575.19166: filtering new block on tags 29946 1726882575.19174: done filtering new block on tags 29946 1726882575.19176: in VariableManager get_vars() 29946 1726882575.19183: done with get_vars() 29946 1726882575.19184: filtering new block on tags 29946 1726882575.19194: done filtering new block on tags 29946 1726882575.19196: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node2 29946 1726882575.19199: extending task lists for all hosts with included blocks 29946 1726882575.19227: done extending task lists 29946 1726882575.19227: done processing included files 29946 1726882575.19228: results queue empty 29946 1726882575.19228: checking for any_errors_fatal 29946 1726882575.19230: done checking for any_errors_fatal 29946 1726882575.19230: checking for max_fail_percentage 29946 1726882575.19231: done checking for max_fail_percentage 29946 1726882575.19231: checking to see if all hosts have failed and the running result is not ok 29946 1726882575.19232: done checking to see if all hosts have failed 29946 1726882575.19232: getting the remaining hosts for this loop 29946 1726882575.19233: done getting the remaining hosts for this loop 29946 1726882575.19234: getting the next task for host managed_node2 29946 1726882575.19237: done getting next task for host managed_node2 29946 1726882575.19238: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 29946 1726882575.19239: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882575.19241: getting variables 29946 1726882575.19241: in VariableManager get_vars() 29946 1726882575.19247: Calling all_inventory to load vars for managed_node2 29946 1726882575.19248: Calling groups_inventory to load vars for managed_node2 29946 1726882575.19249: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882575.19252: Calling all_plugins_play to load vars for managed_node2 29946 1726882575.19254: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882575.19255: Calling groups_plugins_play to load vars for managed_node2 29946 1726882575.19353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.19466: done with get_vars() 29946 1726882575.19472: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:36:15 -0400 (0:00:00.018) 0:00:01.304 ****** 29946 1726882575.19518: entering _queue_task() for managed_node2/setup 29946 1726882575.19705: worker is 1 (out of 1 available) 29946 1726882575.19718: exiting _queue_task() for managed_node2/setup 29946 1726882575.19729: done queuing things up, now waiting for results queue to drain 29946 1726882575.19730: waiting for pending results... 29946 1726882575.19864: running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 29946 1726882575.19926: in run() - task 12673a56-9f93-95e7-9dfb-0000000000c0 29946 1726882575.19936: variable 'ansible_search_path' from source: unknown 29946 1726882575.19939: variable 'ansible_search_path' from source: unknown 29946 1726882575.19966: calling self._execute() 29946 1726882575.20021: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.20025: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.20035: variable 'omit' from source: magic vars 29946 1726882575.20384: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882575.21789: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882575.21838: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882575.21864: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882575.21889: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882575.21917: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882575.21969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882575.21988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882575.22009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29946 1726882575.22040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882575.22051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882575.22167: variable 'ansible_facts' from source: unknown 29946 1726882575.22214: variable 'network_test_required_facts' from source: task vars 29946 1726882575.22242: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 29946 1726882575.22247: variable 'omit' from source: magic vars 29946 1726882575.22270: variable 'omit' from source: magic vars 29946 1726882575.22297: variable 'omit' from source: magic vars 29946 1726882575.22315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882575.22334: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882575.22350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882575.22362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882575.22371: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882575.22396: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882575.22399: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.22402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.22469: Set connection var ansible_pipelining to False 29946 1726882575.22472: Set connection var ansible_shell_executable to /bin/sh 29946 1726882575.22478: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882575.22483: Set connection var ansible_timeout to 10 29946 1726882575.22491: Set connection var ansible_shell_type to sh 29946 1726882575.22495: Set connection var ansible_connection to ssh 29946 1726882575.22511: variable 'ansible_shell_executable' from source: unknown 29946 1726882575.22513: variable 'ansible_connection' from source: unknown 29946 1726882575.22516: variable 'ansible_module_compression' from source: unknown 29946 1726882575.22518: variable 'ansible_shell_type' from source: unknown 29946 1726882575.22520: variable 'ansible_shell_executable' from source: unknown 29946 1726882575.22522: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.22527: variable 'ansible_pipelining' from source: unknown 29946 1726882575.22529: variable 'ansible_timeout' from source: unknown 29946 1726882575.22533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.22628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882575.22635: variable 'omit' from source: magic vars 29946 1726882575.22640: starting attempt loop 29946 
1726882575.22643: running the handler 29946 1726882575.22653: _low_level_execute_command(): starting 29946 1726882575.22659: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882575.23154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882575.23158: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.23160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.23162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.23209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.23223: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.23289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.24872: stdout chunk (state=3): >>>/root <<< 29946 1726882575.24970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.25003: stderr chunk (state=3): >>><<< 29946 1726882575.25007: stdout chunk (state=3): >>><<< 29946 1726882575.25025: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.25035: _low_level_execute_command(): starting 29946 1726882575.25041: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272 `" && echo 
ansible-tmp-1726882575.2502432-30013-207447013663272="` echo /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272 `" ) && sleep 0' 29946 1726882575.25454: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.25462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882575.25485: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.25489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.25492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.25549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882575.25552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.25556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.25622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.27468: stdout chunk (state=3): >>>ansible-tmp-1726882575.2502432-30013-207447013663272=/root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272 <<< 29946 1726882575.27574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.27600: stderr chunk (state=3): >>><<< 29946 1726882575.27604: stdout chunk (state=3): >>><<< 29946 1726882575.27617: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882575.2502432-30013-207447013663272=/root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.27657: 
variable 'ansible_module_compression' from source: unknown 29946 1726882575.27695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 1726882575.27744: variable 'ansible_facts' from source: unknown 29946 1726882575.27998: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py 29946 1726882575.28156: Sending initial data 29946 1726882575.28159: Sent initial data (154 bytes) 29946 1726882575.28725: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.28745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.28836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.30355: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882575.30428: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882575.30489: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpiepgdavr /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py <<< 29946 1726882575.30499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py" <<< 29946 1726882575.30538: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpiepgdavr" to remote "/root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py" <<< 29946 1726882575.32160: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.32169: stdout chunk (state=3): >>><<< 29946 1726882575.32179: stderr chunk (state=3): >>><<< 29946 1726882575.32295: done transferring module to remote 29946 1726882575.32298: _low_level_execute_command(): starting 29946 1726882575.32301: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/ /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py && sleep 0' 29946 1726882575.32851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882575.32864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.32882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.32944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.33005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882575.33018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.33050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.33131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.34889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.34908: stderr chunk (state=3): >>><<< 29946 1726882575.34916: stdout chunk (state=3): >>><<< 29946 1726882575.34937: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 
10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.34944: _low_level_execute_command(): starting 29946 1726882575.34953: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/AnsiballZ_setup.py && sleep 0' 29946 1726882575.35604: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882575.35620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.35634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.35649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882575.35709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882575.35714: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.35788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.35813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.35917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.38030: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 29946 1726882575.38053: stdout chunk (state=3): >>>import _imp # builtin <<< 29946 1726882575.38091: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 29946 1726882575.38159: stdout chunk (state=3): >>>import '_io' # <<< 29946 1726882575.38188: stdout chunk (state=3): >>>import 'marshal' # <<< 29946 1726882575.38201: stdout chunk (state=3): >>>import 'posix' # <<< 29946 1726882575.38252: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 29946 1726882575.38255: stdout chunk (state=3): >>>import 'time' # <<< 29946 1726882575.38275: stdout chunk (state=3): >>>import 'zipimport' # # installed 
zipimport hook <<< 29946 1726882575.38320: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 29946 1726882575.38334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 29946 1726882575.38358: stdout chunk (state=3): >>>import 'codecs' # <<< 29946 1726882575.38425: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 29946 1726882575.38428: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553bc4d0> <<< 29946 1726882575.38462: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995538bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553bea50> <<< 29946 1726882575.38484: stdout chunk (state=3): >>>import '_signal' # <<< 29946 1726882575.38529: stdout chunk (state=3): >>>import '_abc' # <<< 29946 1726882575.38544: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 29946 1726882575.38569: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 29946 1726882575.38667: stdout chunk (state=3): >>>import '_collections_abc' # <<< 29946 1726882575.38689: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 29946 1726882575.38731: stdout chunk (state=3): >>>import 'os' # <<< 29946 1726882575.38763: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 29946 1726882575.38810: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 29946 1726882575.38814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 29946 1726882575.38829: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553cd130> <<< 29946 1726882575.38897: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 29946 1726882575.38911: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553cdfa0> <<< 29946 1726882575.38927: stdout chunk (state=3): >>>import 'site' # <<< 29946 1726882575.38958: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 29946 1726882575.39346: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 29946 1726882575.39351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 29946 1726882575.39388: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.39391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 29946 1726882575.39453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 29946 1726882575.39456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 29946 1726882575.39480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551ebe60> <<< 29946 1726882575.39509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 29946 1726882575.39555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 29946 1726882575.39558: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551ebef0> <<< 29946 1726882575.39594: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 29946 1726882575.39608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 29946 1726882575.39619: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 29946 1726882575.39664: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.39703: stdout chunk (state=3): >>>import 'itertools' # <<< 29946 1726882575.39723: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955223860> <<< 29946 1726882575.39771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955223ef0> <<< 29946 1726882575.39774: stdout chunk (state=3): >>>import '_collections' # <<< 29946 1726882575.39820: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955203b30> import '_functools' # <<< 29946 1726882575.39850: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955201220> <<< 29946 1726882575.39941: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551e9010> <<< 29946 
1726882575.39987: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 29946 1726882575.39990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 29946 1726882575.40047: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 29946 1726882575.40067: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 29946 1726882575.40070: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 29946 1726882575.40110: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552437a0> <<< 29946 1726882575.40150: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552423c0> <<< 29946 1726882575.40161: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552020f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551ea8d0> <<< 29946 1726882575.40212: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 29946 1726882575.40246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552787d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551e8290> <<< 29946 1726882575.40276: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 29946 1726882575.40292: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955278c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955278b30> <<< 29946 1726882575.40325: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955278f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551e6db0> <<< 29946 1726882575.40364: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 29946 1726882575.40395: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 29946 1726882575.40432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552795e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552792b0> <<< 29946 1726882575.40450: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 29946 1726882575.40524: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 29946 1726882575.40528: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995527a4b0> import 'importlib.util' # <<< 29946 1726882575.40538: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 29946 1726882575.40561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 29946 1726882575.40597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552906b0> <<< 29946 1726882575.40632: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.40666: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955291d60> <<< 29946 1726882575.40683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 29946 1726882575.40726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 29946 1726882575.40763: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955292c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.40779: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955293260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955292150> <<< 29946 1726882575.40805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 29946 1726882575.40852: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' 
# extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955293ce0> <<< 29946 1726882575.40863: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955293410> <<< 29946 1726882575.40907: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995527a420> <<< 29946 1726882575.40922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 29946 1726882575.40951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 29946 1726882575.40988: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 29946 1726882575.41026: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 29946 1726882575.41030: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.41051: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954f87c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 29946 1726882575.41102: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb0500> <<< 29946 1726882575.41119: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb06b0> <<< 29946 1726882575.41142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 29946 1726882575.41212: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.41340: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb1040> <<< 29946 1726882575.41456: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb1a30> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb08f0> <<< 29946 1726882575.41503: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954f85df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 29946 1726882575.41537: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 29946 1726882575.41560: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb2db0> <<< 29946 1726882575.41581: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb0ec0> <<< 29946 1726882575.41607: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995527abd0> <<< 29946 1726882575.41633: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 29946 1726882575.41711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.41714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 29946 1726882575.41755: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 29946 1726882575.41773: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fdf0b0> <<< 29946 1726882575.41844: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 29946 1726882575.41848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.41879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 29946 1726882575.41923: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955003440> <<< 29946 1726882575.41939: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 29946 1726882575.41985: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 29946 1726882575.42047: stdout chunk (state=3): >>>import 'ntpath' # <<< 29946 1726882575.42062: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99550601d0> <<< 29946 1726882575.42100: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 29946 
1726882575.42110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 29946 1726882575.42136: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 29946 1726882575.42261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 29946 1726882575.42306: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955062930> <<< 29946 1726882575.42374: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99550602f0> <<< 29946 1726882575.42423: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995502d1f0> <<< 29946 1726882575.42459: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995502d9a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955002240> <<< 29946 1726882575.42471: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb3cb0> <<< 29946 1726882575.42657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 29946 1726882575.42668: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f99550025a0> <<< 29946 1726882575.42938: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ofdzw_xs/ansible_setup_payload.zip' <<< 29946 1726882575.42956: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.43078: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.43095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 29946 1726882575.43140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 29946 1726882575.43212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 29946 1726882575.43248: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995498efc0> import '_typing' # <<< 29946 1726882575.43446: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995496deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995496d070> # zipimport: zlib available <<< 29946 1726882575.43473: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 29946 1726882575.43529: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29946 1726882575.43540: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib 
available <<< 29946 1726882575.44916: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.46042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995498ce90> <<< 29946 1726882575.46071: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 29946 1726882575.46097: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 29946 1726882575.46127: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 29946 1726882575.46154: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99549c2990> <<< 29946 1726882575.46169: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c2750> <<< 29946 1726882575.46206: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c2060> <<< 29946 1726882575.46225: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 29946 1726882575.46269: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c24b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995498fc50> import 'atexit' # <<< 29946 1726882575.46301: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99549c3710> <<< 29946 1726882575.46333: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99549c3950> <<< 29946 1726882575.46354: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 29946 1726882575.46388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 29946 1726882575.46408: stdout chunk (state=3): >>>import '_locale' # <<< 29946 1726882575.46448: stdout chunk (state=3): >>>import 'locale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f99549c3e60> <<< 29946 1726882575.46462: stdout chunk (state=3): >>>import 'pwd' # <<< 29946 1726882575.46500: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 29946 1726882575.46514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 29946 1726882575.46530: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954829bb0> <<< 29946 1726882575.46570: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.46595: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995482b7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 29946 1726882575.46611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 29946 1726882575.46638: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954830110> <<< 29946 1726882575.46652: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 29946 1726882575.46684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 29946 1726882575.46723: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954831280> <<< 29946 1726882575.46726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 29946 1726882575.46744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 29946 1726882575.46772: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 29946 1726882575.46832: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954833d40> <<< 29946 1726882575.46871: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.46874: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99551e6ea0> <<< 29946 1726882575.46899: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954832030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 29946 1726882575.46923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 29946 1726882575.46966: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object 
from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 29946 1726882575.46974: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 29946 1726882575.47081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 29946 1726882575.47098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954837d40> <<< 29946 1726882575.47118: stdout chunk (state=3): >>>import '_tokenize' # <<< 29946 1726882575.47185: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954836810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954836570> <<< 29946 1726882575.47208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 29946 1726882575.47283: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954836ae0> <<< 29946 1726882575.47334: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954832540> <<< 29946 1726882575.47337: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995487bfb0> <<< 29946 1726882575.47390: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487c4d0> <<< 29946 1726882575.47394: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 29946 1726882575.47429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 29946 1726882575.47469: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.47479: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995487db20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487d8e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 29946 1726882575.47511: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 29946 1726882575.47564: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.47579: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995487ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487e1e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 29946 1726882575.47645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.47657: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 29946 1726882575.47670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 29946 1726882575.47698: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99548837d0> <<< 29946 1726882575.47820: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99548801a0> <<< 29946 1726882575.47888: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954884560> <<< 29946 1726882575.47910: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99548847d0> <<< 29946 1726882575.47965: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.47971: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954884b60> <<< 29946 1726882575.48004: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 29946 1726882575.48026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 29946 1726882575.48045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 29946 1726882575.48074: stdout chunk (state=3): >>># extension module '_socket' loaded from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.48095: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954710200> <<< 29946 1726882575.48234: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.48252: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954711850> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954886990> <<< 29946 1726882575.48298: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954887d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99548865d0> # zipimport: zlib available <<< 29946 1726882575.48319: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 29946 1726882575.48331: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.48403: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.48507: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.48510: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 29946 1726882575.48553: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 29946 1726882575.48557: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.48668: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.48787: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.49296: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.49836: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 29946 1726882575.49866: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.49927: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954715ac0> <<< 29946 1726882575.50001: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 29946 
1726882575.50024: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954716900> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954711af0> <<< 29946 1726882575.50072: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 29946 1726882575.50088: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.50125: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 29946 1726882575.50136: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.50271: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.50447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954716870> <<< 29946 1726882575.50458: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.50895: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51364: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51402: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51488: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 29946 1726882575.51518: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51555: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 29946 1726882575.51631: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51729: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 29946 1726882575.51743: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 29946 1726882575.51767: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51790: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.51839: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 29946 1726882575.52068: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 29946 1726882575.52342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 29946 1726882575.52350: stdout chunk (state=3): >>>import '_ast' # <<< 29946 1726882575.52419: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954717a70> <<< 29946 1726882575.52424: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52505: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52573: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 29946 1726882575.52585: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 29946 1726882575.52604: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52651: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 
1726882575.52681: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 29946 1726882575.52697: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52736: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52783: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52838: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.52906: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 29946 1726882575.52935: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.53011: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954722360> <<< 29946 1726882575.53049: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995471dc70> <<< 29946 1726882575.53080: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # <<< 29946 1726882575.53083: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 29946 1726882575.53156: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53214: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53244: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53282: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 29946 1726882575.53290: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.53304: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 29946 1726882575.53325: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 29946 1726882575.53342: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 29946 1726882575.53400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 29946 1726882575.53419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 29946 1726882575.53432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 29946 1726882575.53486: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995480aa80> <<< 29946 1726882575.53529: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549ea750> <<< 29946 1726882575.53608: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9954885d60> <<< 29946 1726882575.53618: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954885010> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 29946 1726882575.53624: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53645: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53680: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 29946 1726882575.53753: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 29946 1726882575.53769: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 29946 1726882575.53783: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53838: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53906: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53913: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53939: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.53988: stdout chunk (state=3): >>># zipimport: zlib available<<< 29946 1726882575.54004: stdout chunk (state=3): >>> <<< 29946 1726882575.54025: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54064: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 29946 1726882575.54117: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54180: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54248: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54270: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54308: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 29946 1726882575.54313: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54491: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54662: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54705: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.54752: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 29946 1726882575.54770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882575.54782: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 29946 1726882575.54795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 29946 1726882575.54815: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 29946 1726882575.54848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 29946 1726882575.54868: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b2990> <<< 
29946 1726882575.54901: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 29946 1726882575.54918: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 29946 1726882575.54952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 29946 1726882575.54984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 29946 1726882575.55007: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b0320> <<< 29946 1726882575.55026: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.55045: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99543b0680> <<< 29946 1726882575.55090: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995479c440> <<< 29946 1726882575.55116: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b34d0> <<< 29946 1726882575.55133: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b10a0> <<< 29946 1726882575.55170: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b0ce0> <<< 29946 1726882575.55173: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 29946 1726882575.55242: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 29946 1726882575.55247: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 29946 1726882575.55271: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 29946 1726882575.55299: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.55315: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99543b3620> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b2ed0> <<< 29946 1726882575.55350: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' 
executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99543b30b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b2300> <<< 29946 1726882575.55371: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 29946 1726882575.55461: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 29946 1726882575.55475: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b3770> <<< 29946 1726882575.55479: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 29946 1726882575.55515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 29946 1726882575.55542: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.55547: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954412270> <<< 29946 1726882575.55573: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99544102c0> <<< 29946 1726882575.55611: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b0d40> <<< 29946 1726882575.55617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 29946 1726882575.55624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 29946 1726882575.55639: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55648: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 29946 1726882575.55668: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55727: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 29946 1726882575.55797: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55848: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55890: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 29946 1726882575.55914: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55928: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55932: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 29946 1726882575.55937: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55970: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.55998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 29946 1726882575.56016: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56058: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 29946 1726882575.56110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 29946 1726882575.56118: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56161: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 29946 1726882575.56215: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56267: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56326: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56382: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 29946 1726882575.56449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 29946 1726882575.56456: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.56912: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57334: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 29946 1726882575.57340: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57398: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57451: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57483: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57514: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 29946 1726882575.57526: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 29946 1726882575.57536: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57561: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57599: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 29946 1726882575.57602: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57662: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57712: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 29946 1726882575.57732: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57759: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 29946 1726882575.57797: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57829: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 29946 1726882575.57874: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.57946: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 29946 1726882575.58035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 29946 1726882575.58055: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954413740> <<< 29946 1726882575.58077: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 29946 1726882575.58109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 29946 1726882575.58222: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954412d80> <<< 29946 1726882575.58228: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 29946 1726882575.58302: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 29946 1726882575.58372: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58461: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58552: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 29946 1726882575.58557: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58626: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 29946 1726882575.58707: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58739: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.58791: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 29946 1726882575.58833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 29946 1726882575.58899: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.58956: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.58961: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995444e390> <<< 29946 1726882575.59135: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995443eb70> <<< 29946 1726882575.59140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 29946 1726882575.59205: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 29946 1726882575.59270: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59352: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59432: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59542: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59686: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 29946 1726882575.59704: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59734: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59780: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 29946 1726882575.59799: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.59825: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 29946 1726882575.59894: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 29946 1726882575.59898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 29946 1726882575.59917: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882575.59934: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954462150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954461d90> import 'ansible.module_utils.facts.system.user' # <<< 29946 1726882575.59958: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29946 1726882575.59967: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 29946 1726882575.59980: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60022: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 29946 1726882575.60067: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60225: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60375: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 29946 1726882575.60378: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60480: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60580: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60618: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60660: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 29946 1726882575.60675: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 29946 1726882575.60685: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60700: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60725: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.60860: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.61006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # <<< 29946 1726882575.61014: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 29946 1726882575.61131: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.61252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 29946 1726882575.61257: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.61296: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.61327: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.61861: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.62384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 29946 1726882575.62388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 29946 
1726882575.62402: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.62489: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.62602: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 29946 1726882575.62605: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.62699: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.62797: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 29946 1726882575.62803: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.62958: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 29946 1726882575.63125: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63129: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63146: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 29946 1726882575.63197: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63231: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 29946 1726882575.63245: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63340: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63437: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63633: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 29946 1726882575.63835: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 29946 1726882575.63850: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63881: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 29946 1726882575.63930: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63955: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.63980: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 29946 1726882575.63991: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64056: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 29946 1726882575.64131: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64155: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64174: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 29946 1726882575.64190: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64250: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64307: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 29946 1726882575.64312: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64372: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 29946 1726882575.64438: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 29946 1726882575.64696: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.64948: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 29946 1726882575.64953: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65016: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65069: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 29946 1726882575.65084: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65118: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65153: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 29946 1726882575.65162: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65199: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 29946 1726882575.65242: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65268: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65308: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 29946 1726882575.65313: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65398: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65472: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 29946 1726882575.65492: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65500: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 29946 1726882575.65517: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65564: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65613: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 29946 1726882575.65634: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65659: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65712: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65762: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65824: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65898: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 29946 1726882575.65916: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 29946 1726882575.65921: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.65977: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 29946 1726882575.66033: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66222: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66416: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 29946 1726882575.66425: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66470: stdout chunk (state=3): >>># zipimport: zlib available <<< 
29946 1726882575.66524: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 29946 1726882575.66530: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66572: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66620: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 29946 1726882575.66626: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66711: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 29946 1726882575.66796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 29946 1726882575.66807: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66891: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.66979: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 29946 1726882575.66985: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 29946 1726882575.67051: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882575.67206: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 29946 1726882575.67234: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 29946 1726882575.67245: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 29946 1726882575.67281: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954263260> <<< 29946 1726882575.67295: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954261e20> <<< 29946 1726882575.67340: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954262db0> <<< 29946 1726882575.68396: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "15", "epoch": "1726882575", "epoch_int": "1726882575", "date": "2024-09-20", "time": "21:36:15", "iso8601_micro": "2024-09-21T01:36:15.675878Z", "iso8601": "2024-09-21T01:36:15Z", 
"iso8601_basic": "20240920T213615675878", "iso8601_basic_short": "20240920T213615", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "an<<< 29946 1726882575.68405: stdout chunk (state=3): >>>sible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": 
"enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882575.68899: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ <<< 29946 1726882575.68907: stdout chunk (state=3): >>># clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path <<< 29946 1726882575.68928: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport <<< 29946 1726882575.68949: stdout chunk (state=3): >>># cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path <<< 29946 1726882575.68954: stdout chunk (state=3): >>># cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator <<< 29946 1726882575.68972: stdout chunk (state=3): >>># cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib <<< 29946 1726882575.68988: stdout chunk (state=3): >>># destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] 
removing fnmatch # cleanup[2] removing errno <<< 29946 1726882575.69009: stdout chunk (state=3): >>># cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 29946 1726882575.69017: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils <<< 29946 1726882575.69046: stdout chunk (state=3): >>># destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize <<< 29946 1726882575.69060: stdout chunk (state=3): >>># cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket <<< 29946 1726882575.69071: stdout chunk (state=3): >>># cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters <<< 29946 1726882575.69091: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters <<< 29946 1726882575.69125: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 29946 1726882575.69158: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux <<< 29946 1726882575.69225: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing 
ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux <<< 29946 1726882575.69229: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy 
ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat <<< 29946 1726882575.69234: stdout chunk (state=3): >>># cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 29946 1726882575.69551: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 29946 1726882575.69562: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 29946 1726882575.69586: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 29946 1726882575.69608: stdout chunk (state=3): >>># destroy _blake2 <<< 29946 1726882575.69614: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 29946 1726882575.69634: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 29946 1726882575.69663: stdout chunk (state=3): >>># destroy ntpath <<< 29946 1726882575.69705: stdout chunk (state=3): >>># destroy importlib <<< 29946 1726882575.69714: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 29946 1726882575.69721: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 29946 1726882575.69740: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal <<< 29946 1726882575.69751: stdout chunk (state=3): >>># destroy _posixsubprocess <<< 29946 1726882575.69757: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 29946 1726882575.69788: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 29946 1726882575.69809: stdout chunk (state=3): >>># destroy distro # destroy distro.distro <<< 29946 1726882575.69814: stdout chunk (state=3): >>># destroy argparse # destroy logging <<< 29946 1726882575.69845: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 29946 1726882575.69861: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context <<< 29946 1726882575.69877: stdout chunk (state=3): >>># destroy array # destroy _compat_pickle # destroy _pickle # destroy queue <<< 29946 1726882575.69891: stdout chunk (state=3): >>># destroy _heapq # destroy _queue <<< 29946 1726882575.69901: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy 
multiprocessing.util <<< 29946 1726882575.69928: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 29946 1726882575.69940: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 29946 1726882575.69946: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 29946 1726882575.69975: stdout chunk (state=3): >>># destroy _ssl <<< 29946 1726882575.69991: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 29946 1726882575.69999: stdout chunk (state=3): >>># destroy termios <<< 29946 1726882575.70004: stdout chunk (state=3): >>># destroy errno # destroy json <<< 29946 1726882575.70025: stdout chunk (state=3): >>># destroy socket # destroy struct <<< 29946 1726882575.70031: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 29946 1726882575.70066: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 29946 1726882575.70097: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap <<< 29946 1726882575.70111: stdout chunk (state=3): >>># cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 29946 1726882575.70121: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 29946 1726882575.70142: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 29946 1726882575.70159: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 29946 1726882575.70185: stdout chunk (state=3): >>># destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 29946 1726882575.70196: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 29946 1726882575.70217: stdout chunk (state=3): >>># destroy posixpath <<< 29946 1726882575.70236: stdout chunk (state=3): >>># cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping 
encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 29946 1726882575.70249: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 29946 1726882575.70258: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 29946 1726882575.70264: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 29946 1726882575.70397: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 29946 1726882575.70423: stdout chunk (state=3): >>># destroy _collections <<< 29946 1726882575.70441: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 29946 1726882575.70446: stdout chunk (state=3): >>># destroy tokenize <<< 29946 1726882575.70469: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 29946 1726882575.70503: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 29946 1726882575.70520: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 29946 1726882575.70535: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 29946 1726882575.70555: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 29946 1726882575.70651: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 29946 1726882575.70655: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 29946 1726882575.70664: stdout chunk (state=3): >>># destroy time <<< 29946 1726882575.70697: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 29946 1726882575.70715: stdout chunk (state=3): >>># destroy _hashlib <<< 29946 1726882575.70722: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 29946 1726882575.70746: stdout chunk (state=3): >>># destroy itertools <<< 29946 1726882575.70753: stdout chunk (state=3): >>># destroy _abc <<< 29946 1726882575.70756: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 29946 1726882575.70766: stdout chunk (state=3): >>># clear sys.audit hooks <<< 29946 1726882575.71103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882575.71138: stderr chunk (state=3): >>><<< 29946 1726882575.71141: stdout chunk (state=3): >>><<< 29946 1726882575.71242: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995538bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99553cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551ebe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551ebef0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955223860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955223ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955203b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955201220> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551e9010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552437a0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552423c0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552020f0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551ea8d0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552787d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551e8290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955278c80> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955278b30> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955278f20> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99551e6db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552795e0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552792b0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995527a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99552906b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955291d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9955292c00> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955293260> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955292150> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9955293ce0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955293410> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995527a420> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954f87c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb07a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb0500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb06b0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb1040> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954fb1a30> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb08f0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954f85df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb2db0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb0ec0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995527abd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fdf0b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955003440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99550601d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955062930> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99550602f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995502d1f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995502d9a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9955002240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954fb3cb0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f99550025a0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ofdzw_xs/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995498efc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995496deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995496d070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995498ce90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99549c2990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c2750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c2060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c24b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995498fc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99549c3710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99549c3950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549c3e60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954829bb0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995482b7d0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954830110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954831280> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954833d40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99551e6ea0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954832030> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954837d40> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954836810> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954836570> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954836ae0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954832540> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995487bfb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487c4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995487db20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487d8e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995487ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487e1e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99548837d0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99548801a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954884560> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99548847d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954884b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995487c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954710200> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954711850> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954886990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954887d40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99548865d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954715ac0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954716900> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954711af0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954716870> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954717a70> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954722360> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995471dc70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995480aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99549ea750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954885d60> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954885010> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b2990> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b0320> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f99543b0680> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995479c440> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b34d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b10a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b0ce0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99543b3620> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b2ed0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f99543b30b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b2300> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99543b3770> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954412270> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99544102c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f99547b0d40> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954413740> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954412d80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f995444e390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f995443eb70> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954462150> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954461d90> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9954263260> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954261e20> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9954262db0> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "15", "epoch": "1726882575", "epoch_int": "1726882575", "date": "2024-09-20", "time": "21:36:15", "iso8601_micro": "2024-09-21T01:36:15.675878Z", "iso8601": "2024-09-21T01:36:15Z", "iso8601_basic": "20240920T213615675878", "iso8601_basic_short": "20240920T213615", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, 
"ansible_local": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # 
cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing 
ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] 
wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
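Editor's note: the remote environment above shows PYTHONVERBOSE=1 (see ansible_env in the facts payload), so the interpreter printed every import and shutdown cleanup line around the module's JSON result. Ansible still recovers the facts by locating the JSON object inside the mixed output, and it reports the leftover text with the "junk after the JSON data" warning that follows. The snippet below is a minimal, hypothetical Python sketch of that kind of extraction; it illustrates the general technique only and is not Ansible's actual implementation, and the sample_output value is invented for the example.

    import json

    def extract_json_payload(raw):
        """Return (payload, junk): the first line that parses as a JSON
        object, plus everything else concatenated back together."""
        lines = raw.splitlines()
        for i, line in enumerate(lines):
            candidate = line.strip()
            if candidate.startswith("{"):
                try:
                    payload = json.loads(candidate)
                except ValueError:
                    continue
                junk = "\n".join(lines[:i] + lines[i + 1:])
                return payload, junk
        raise ValueError("no JSON object found in module output")

    # Hypothetical sample: verbose import/cleanup noise surrounding the real result.
    sample_output = (
        "import 'json' # <loader>\n"
        '{"ansible_facts": {"ansible_system": "Linux"}, "invocation": {}}\n'
        "# cleanup[2] removing sys\n"
    )
    facts, junk = extract_json_payload(sample_output)
    print(facts["ansible_facts"]["ansible_system"])  # -> Linux
    print(bool(junk))  # -> True; non-empty junk is what triggers the warning

In this run the payload is the {"ansible_facts": ...} block printed above, and the surrounding import and cleanup lines are exactly what the warning below quotes back.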
[WARNING]: Module invocation had junk after the JSON data: [junk identical to the Python interpreter import/cleanup trace already shown verbatim in the module stdout above] 29946 1726882575.72011: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882575.72015: _low_level_execute_command(): starting 29946 1726882575.72017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882575.2502432-30013-207447013663272/ > /dev/null 2>&1 && sleep 0' 29946 1726882575.72070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.72074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.72076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.72078: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882575.72080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.72134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882575.72137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.72142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.72204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.73991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.74020: stderr chunk (state=3): >>><<< 29946 1726882575.74023: stdout chunk (state=3): >>><<< 29946 1726882575.74039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.74044: handler run complete 29946 1726882575.74073: variable 'ansible_facts' from source: unknown 29946 1726882575.74112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.74182: variable 'ansible_facts' from source: unknown 29946 1726882575.74255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.74319: attempt loop complete, returning result 29946 1726882575.74322: _execute() done 29946 1726882575.74324: dumping result to json 29946 1726882575.74363: done dumping result, returning 29946 1726882575.74382: done running TaskExecutor() for managed_node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-95e7-9dfb-0000000000c0] 29946 1726882575.74384: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c0 29946 1726882575.74491: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c0 29946 1726882575.74495: WORKER PROCESS EXITING ok: [managed_node2] 29946 1726882575.74596: no more pending results, returning what we have 29946 1726882575.74599: results queue empty 29946 1726882575.74599: checking for any_errors_fatal 29946 1726882575.74601: 
done checking for any_errors_fatal 29946 1726882575.74601: checking for max_fail_percentage 29946 1726882575.74603: done checking for max_fail_percentage 29946 1726882575.74604: checking to see if all hosts have failed and the running result is not ok 29946 1726882575.74604: done checking to see if all hosts have failed 29946 1726882575.74605: getting the remaining hosts for this loop 29946 1726882575.74606: done getting the remaining hosts for this loop 29946 1726882575.74610: getting the next task for host managed_node2 29946 1726882575.74617: done getting next task for host managed_node2 29946 1726882575.74619: ^ task is: TASK: Check if system is ostree 29946 1726882575.74622: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882575.74625: getting variables 29946 1726882575.74626: in VariableManager get_vars() 29946 1726882575.74655: Calling all_inventory to load vars for managed_node2 29946 1726882575.74658: Calling groups_inventory to load vars for managed_node2 29946 1726882575.74660: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882575.74669: Calling all_plugins_play to load vars for managed_node2 29946 1726882575.74672: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882575.74674: Calling groups_plugins_play to load vars for managed_node2 29946 1726882575.74820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882575.74940: done with get_vars() 29946 1726882575.74948: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:36:15 -0400 (0:00:00.554) 0:00:01.859 ****** 29946 1726882575.75017: entering _queue_task() for managed_node2/stat 29946 1726882575.75212: worker is 1 (out of 1 available) 29946 1726882575.75225: exiting _queue_task() for managed_node2/stat 29946 1726882575.75236: done queuing things up, now waiting for results queue to drain 29946 1726882575.75237: waiting for pending results... 
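
The next task pulled from the queue above, "Check if system is ostree" (task path .../tests/network/tasks/el_repo_setup.yml:17), lets the test skip steps that only make sense on package-based systems; the lines that follow show the ssh connection and sh shell plugins being loaded and the conditional 'not __network_is_ostree is defined' evaluating to True before the stat module is pushed to the managed node as AnsiballZ_stat.py. Purely as a hedged sketch of what such a check amounts to, and assuming the conventional /run/ostree-booted marker file (the path actually stat'ed is not shown in this part of the log), a local Python equivalent might be:

    import os

    def is_ostree_system(marker_path: str = "/run/ostree-booted") -> bool:
        """True when the host looks like an image-based (OSTree / rpm-ostree) system.
        The marker path is an assumption for illustration; the logged task instead
        runs the stat module remotely and registers its result."""
        return os.path.exists(marker_path)

    if __name__ == "__main__":
        # On a traditional dnf/yum-managed Fedora or RHEL host this prints False.
        print("ostree system:", is_ostree_system())
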
29946 1726882575.75369: running TaskExecutor() for managed_node2/TASK: Check if system is ostree 29946 1726882575.75431: in run() - task 12673a56-9f93-95e7-9dfb-0000000000c2 29946 1726882575.75442: variable 'ansible_search_path' from source: unknown 29946 1726882575.75446: variable 'ansible_search_path' from source: unknown 29946 1726882575.75472: calling self._execute() 29946 1726882575.75530: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.75534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.75542: variable 'omit' from source: magic vars 29946 1726882575.75999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882575.76195: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882575.76246: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882575.76283: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882575.76333: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882575.76428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882575.76457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882575.76489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882575.76527: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882575.76663: Evaluated conditional (not __network_is_ostree is defined): True 29946 1726882575.76702: variable 'omit' from source: magic vars 29946 1726882575.76724: variable 'omit' from source: magic vars 29946 1726882575.76775: variable 'omit' from source: magic vars 29946 1726882575.76810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882575.76857: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882575.76868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882575.76917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882575.76966: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882575.76969: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882575.76972: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.76980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.77088: Set connection var ansible_pipelining to False 29946 1726882575.77102: Set connection var ansible_shell_executable to /bin/sh 29946 1726882575.77112: Set 
connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882575.77182: Set connection var ansible_timeout to 10 29946 1726882575.77185: Set connection var ansible_shell_type to sh 29946 1726882575.77190: Set connection var ansible_connection to ssh 29946 1726882575.77196: variable 'ansible_shell_executable' from source: unknown 29946 1726882575.77198: variable 'ansible_connection' from source: unknown 29946 1726882575.77201: variable 'ansible_module_compression' from source: unknown 29946 1726882575.77203: variable 'ansible_shell_type' from source: unknown 29946 1726882575.77204: variable 'ansible_shell_executable' from source: unknown 29946 1726882575.77206: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882575.77208: variable 'ansible_pipelining' from source: unknown 29946 1726882575.77210: variable 'ansible_timeout' from source: unknown 29946 1726882575.77212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882575.77365: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882575.77373: variable 'omit' from source: magic vars 29946 1726882575.77378: starting attempt loop 29946 1726882575.77384: running the handler 29946 1726882575.77424: _low_level_execute_command(): starting 29946 1726882575.77428: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882575.77899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.77903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.77906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882575.77909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.77951: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.77967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.78028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.79607: stdout chunk (state=3): >>>/root <<< 29946 1726882575.79841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.79844: stdout chunk (state=3): >>><<< 29946 1726882575.79847: stderr chunk (state=3): >>><<< 29946 1726882575.79851: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.79860: _low_level_execute_command(): starting 29946 1726882575.79862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437 `" && echo ansible-tmp-1726882575.7976153-30044-25928716060437="` echo /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437 `" ) && sleep 0' 29946 1726882575.80363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882575.80375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882575.80391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.80415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882575.80513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.80532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.80624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.82496: stdout chunk (state=3): >>>ansible-tmp-1726882575.7976153-30044-25928716060437=/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437 <<< 29946 1726882575.82637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.82646: stdout chunk (state=3): >>><<< 29946 1726882575.82659: stderr chunk (state=3): >>><<< 29946 1726882575.82680: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882575.7976153-30044-25928716060437=/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.82734: variable 'ansible_module_compression' from source: unknown 29946 1726882575.82800: ANSIBALLZ: Using lock for stat 29946 1726882575.82808: ANSIBALLZ: Acquiring lock 29946 1726882575.82868: ANSIBALLZ: Lock acquired: 140626579265280 29946 1726882575.82871: ANSIBALLZ: Creating module 29946 1726882575.92329: ANSIBALLZ: Writing module into payload 29946 1726882575.92394: ANSIBALLZ: Writing module 29946 1726882575.92410: ANSIBALLZ: Renaming module 29946 1726882575.92416: ANSIBALLZ: Done creating module 29946 1726882575.92431: variable 'ansible_facts' from source: unknown 29946 1726882575.92484: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py 29946 1726882575.92576: Sending initial data 29946 1726882575.92580: Sent initial data (152 bytes) 29946 1726882575.93109: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882575.93139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882575.93157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.93177: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.93266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 
1726882575.94798: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882575.94806: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882575.94854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882575.94947: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpnhs2nxsy /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py <<< 29946 1726882575.94951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py" <<< 29946 1726882575.95004: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpnhs2nxsy" to remote "/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py" <<< 29946 1726882575.95895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.96049: stderr chunk (state=3): >>><<< 29946 1726882575.96052: stdout chunk (state=3): >>><<< 29946 1726882575.96054: done transferring module to remote 29946 1726882575.96056: _low_level_execute_command(): starting 29946 1726882575.96058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/ /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py && sleep 0' 29946 1726882575.96679: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882575.96712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882575.96727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882575.96760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882575.96830: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 
1726882575.96843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882575.96854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.96934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882575.98669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882575.98698: stderr chunk (state=3): >>><<< 29946 1726882575.98701: stdout chunk (state=3): >>><<< 29946 1726882575.98710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882575.98713: _low_level_execute_command(): starting 29946 1726882575.98718: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/AnsiballZ_stat.py && sleep 0' 29946 1726882575.99325: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882575.99332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882575.99401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882576.01524: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 29946 1726882576.01553: stdout chunk (state=3): >>>import _imp # 
builtin <<< 29946 1726882576.01582: stdout chunk (state=3): >>>import '_thread' # <<< 29946 1726882576.01591: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 29946 1726882576.01658: stdout chunk (state=3): >>>import '_io' # <<< 29946 1726882576.01663: stdout chunk (state=3): >>>import 'marshal' # <<< 29946 1726882576.01692: stdout chunk (state=3): >>>import 'posix' # <<< 29946 1726882576.01728: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 29946 1726882576.01759: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 29946 1726882576.01764: stdout chunk (state=3): >>># installed zipimport hook <<< 29946 1726882576.01819: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 29946 1726882576.01827: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.01836: stdout chunk (state=3): >>>import '_codecs' # <<< 29946 1726882576.01866: stdout chunk (state=3): >>>import 'codecs' # <<< 29946 1726882576.01894: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 29946 1726882576.01925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 29946 1726882576.01928: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1054184d0> <<< 29946 1726882576.01942: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1053e7b30> <<< 29946 1726882576.01966: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 29946 1726882576.01971: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10541aa50> <<< 29946 1726882576.01996: stdout chunk (state=3): >>>import '_signal' # <<< 29946 1726882576.02031: stdout chunk (state=3): >>>import '_abc' # <<< 29946 1726882576.02035: stdout chunk (state=3): >>>import 'abc' # <<< 29946 1726882576.02043: stdout chunk (state=3): >>>import 'io' # <<< 29946 1726882576.02076: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 29946 1726882576.02165: stdout chunk (state=3): >>>import '_collections_abc' # <<< 29946 1726882576.02191: stdout chunk (state=3): >>>import 'genericpath' # <<< 29946 1726882576.02197: stdout chunk (state=3): >>>import 'posixpath' # <<< 29946 1726882576.02219: stdout chunk (state=3): >>>import 'os' # <<< 29946 1726882576.02242: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 29946 1726882576.02262: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 29946 1726882576.02265: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 29946 1726882576.02287: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 29946 1726882576.02296: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 29946 1726882576.02313: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 29946 1726882576.02341: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1051c9130> <<< 29946 1726882576.02391: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 29946 1726882576.02409: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.02413: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1051c9fa0> <<< 29946 1726882576.02439: stdout chunk (state=3): >>>import 'site' # <<< 29946 1726882576.02467: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 29946 1726882576.02697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 29946 1726882576.02703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 29946 1726882576.02728: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 29946 1726882576.02731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.02756: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 29946 1726882576.02803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 29946 1726882576.02817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 29946 1726882576.02840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 29946 1726882576.02860: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105207e60> <<< 29946 1726882576.02868: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 29946 1726882576.02892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 29946 1726882576.02915: stdout chunk (state=3): >>>import '_operator' # <<< 29946 1726882576.02919: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105207f20> <<< 29946 1726882576.02935: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 29946 1726882576.02965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 29946 1726882576.02985: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 29946 1726882576.03038: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.03050: stdout chunk (state=3): >>>import 'itertools' # <<< 29946 1726882576.03081: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10523f890> <<< 29946 1726882576.03102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 29946 1726882576.03115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 29946 1726882576.03130: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10523ff20> <<< 29946 1726882576.03134: stdout chunk (state=3): >>>import '_collections' # <<< 29946 1726882576.03183: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10521fb30> <<< 29946 1726882576.03195: stdout chunk (state=3): >>>import '_functools' # <<< 29946 1726882576.03229: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10521d250> <<< 29946 1726882576.03313: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105205010> <<< 29946 1726882576.03342: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 29946 1726882576.03358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 29946 1726882576.03372: stdout chunk (state=3): >>>import '_sre' # <<< 29946 1726882576.03392: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 29946 1726882576.03419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 29946 1726882576.03440: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 29946 1726882576.03450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 29946 1726882576.03475: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10525f800> <<< 29946 1726882576.03496: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10525e450> <<< 29946 1726882576.03522: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 29946 1726882576.03525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10521e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10525ccb0> <<< 29946 1726882576.03590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 29946 1726882576.03595: stdout chunk 
(state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105294860> <<< 29946 1726882576.03612: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105204290> <<< 29946 1726882576.03629: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 29946 1726882576.03662: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.03667: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105294d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105294bc0> <<< 29946 1726882576.03711: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.03717: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105294fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105202db0> <<< 29946 1726882576.03751: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.03771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 29946 1726882576.03807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 29946 1726882576.03818: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052956a0> <<< 29946 1726882576.03827: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105295370> <<< 29946 1726882576.03832: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 29946 1726882576.03859: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 29946 1726882576.03881: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052965a0> <<< 29946 1726882576.03901: stdout chunk (state=3): >>>import 'importlib.util' # <<< 29946 1726882576.03906: stdout chunk (state=3): >>>import 'runpy' # <<< 29946 1726882576.03926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 29946 1726882576.03960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 29946 1726882576.03982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 29946 1726882576.03989: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052ac7a0> <<< 29946 1726882576.04012: stdout chunk (state=3): >>>import 'errno' # <<< 29946 1726882576.04039: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.04042: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1052ade80> <<< 29946 1726882576.04070: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 29946 1726882576.04078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 29946 1726882576.04110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 29946 1726882576.04114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 29946 1726882576.04129: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052aed20> <<< 29946 1726882576.04160: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1052af320> <<< 29946 1726882576.04169: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052ae270> <<< 29946 1726882576.04204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 29946 1726882576.04212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 29946 1726882576.04245: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.04251: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1052afda0> <<< 29946 1726882576.04262: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052af4d0> <<< 29946 1726882576.04306: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105296510> <<< 29946 1726882576.04324: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 29946 1726882576.04354: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 29946 1726882576.04367: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 29946 1726882576.04398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 29946 1726882576.04421: stdout chunk (state=3): >>># extension module 'math' loaded 
from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc10503bbf0> <<< 29946 1726882576.04449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 29946 1726882576.04451: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 29946 1726882576.04478: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.04483: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105064740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050644a0> <<< 29946 1726882576.04512: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105064680> <<< 29946 1726882576.04546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 29946 1726882576.04617: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.04735: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105064fe0> <<< 29946 1726882576.04843: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105065910> <<< 29946 1726882576.04855: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050648c0> <<< 29946 1726882576.04871: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105039d90> <<< 29946 1726882576.04903: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 29946 1726882576.04912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 29946 1726882576.04942: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 29946 1726882576.04953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 29946 1726882576.04960: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fc105066d20> <<< 29946 1726882576.04980: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105065a60> <<< 29946 1726882576.05006: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105296750> <<< 29946 1726882576.05023: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 29946 1726882576.05084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.05104: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 29946 1726882576.05141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 29946 1726882576.05161: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10508f080> <<< 29946 1726882576.05222: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 29946 1726882576.05233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.05275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 29946 1726882576.05279: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 29946 1726882576.05329: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050b3440> <<< 29946 1726882576.05338: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 29946 1726882576.05381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 29946 1726882576.05437: stdout chunk (state=3): >>>import 'ntpath' # <<< 29946 1726882576.05465: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105114230> <<< 29946 1726882576.05474: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 29946 1726882576.05525: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 29946 1726882576.05539: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 29946 1726882576.05832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105116990> <<< 29946 1726882576.05871: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105114350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050e1250> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f1d310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050b2240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105067c50> <<< 29946 1726882576.06117: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc104f1d5b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_vvu4onnn/ansible_stat_payload.zip' # zipimport: zlib available <<< 29946 1726882576.06233: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.06264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 29946 1726882576.06275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 29946 1726882576.06314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 29946 1726882576.06383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 29946 1726882576.06448: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f72fc0> import '_typing' # <<< 29946 1726882576.06609: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f51eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f510a0> <<< 29946 1726882576.06623: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.06770: stdout chunk (state=3): >>>import 'ansible' # <<< 29946 1726882576.06773: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.06794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 29946 1726882576.08112: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.09267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 29946 1726882576.09303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f712b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.09333: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 29946 1726882576.09361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 29946 1726882576.09374: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 29946 1726882576.09410: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104f9e9f0> <<< 29946 1726882576.09445: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9e780> <<< 29946 1726882576.09473: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9e090> <<< 29946 1726882576.09509: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 29946 1726882576.09547: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9e4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f73c50> <<< 29946 1726882576.09576: stdout chunk (state=3): >>>import 'atexit' # <<< 29946 1726882576.09607: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.09638: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104f9f770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104f9f9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 29946 1726882576.09713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 29946 1726882576.09721: stdout chunk (state=3): >>>import '_locale' # <<< 29946 1726882576.09759: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9fef0> import 'pwd' # <<< 29946 1726882576.09783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 29946 1726882576.09821: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 29946 1726882576.09848: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104911d00> <<< 29946 1726882576.09888: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104913920> <<< 29946 1726882576.09907: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 29946 1726882576.09931: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 29946 1726882576.09967: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104914290> <<< 29946 1726882576.09977: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 29946 1726882576.10014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 29946 1726882576.10039: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104915430> <<< 29946 1726882576.10057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 29946 1726882576.10075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 29946 1726882576.10105: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 29946 1726882576.10174: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104917ef0> <<< 29946 1726882576.10206: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105207d10> <<< 29946 1726882576.10232: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049161b0> <<< 29946 1726882576.10243: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 29946 1726882576.10269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 29946 1726882576.10306: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 29946 1726882576.10360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 29946 1726882576.10366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 29946 1726882576.10398: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 29946 1726882576.10412: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491fe60> import '_tokenize' # <<< 29946 1726882576.10492: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491e960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491e6c0> <<< 29946 1726882576.10510: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 29946 1726882576.10576: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491ec00> <<< 29946 1726882576.10628: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049166c0> <<< 29946 1726882576.10631: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.10685: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104967a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049681a0> <<< 29946 1726882576.10692: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 29946 1726882576.10733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 29946 1726882576.10778: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104969be0> <<< 29946 1726882576.10792: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049699a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 29946 1726882576.10925: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 29946 1726882576.10983: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.11006: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc10496c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 29946 1726882576.11057: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.11084: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 29946 1726882576.11135: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496f860> <<< 29946 1726882576.11262: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496c230> <<< 29946 1726882576.11323: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104970410> <<< 29946 1726882576.11352: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104970aa0> <<< 29946 1726882576.11430: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104970b30> <<< 29946 1726882576.11433: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104968380> <<< 29946 1726882576.11462: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 29946 1726882576.11485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 29946 1726882576.11511: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.11539: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1049f81a0> <<< 29946 1726882576.11702: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.11717: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1049f9250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104972930> <<< 29946 1726882576.11775: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104973ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104972570> # zipimport: zlib available <<< 29946 1726882576.11782: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 29946 1726882576.11809: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.11884: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.11994: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29946 1726882576.12026: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 29946 1726882576.12037: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 29946 1726882576.12161: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.12278: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.12836: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.13377: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 29946 1726882576.13381: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 29946 1726882576.13412: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.13472: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 29946 1726882576.13483: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1048014f0> <<< 29946 1726882576.13567: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 29946 1726882576.13581: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1048022a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049f9430> <<< 29946 1726882576.13639: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 29946 1726882576.13660: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 29946 1726882576.13690: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 29946 1726882576.13834: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.14002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104802900> <<< 29946 1726882576.14018: stdout chunk (state=3): >>># zipimport: zlib available <<< 
29946 1726882576.14472: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.14914: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.14985: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15078: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 29946 1726882576.15081: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15108: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15149: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 29946 1726882576.15220: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15327: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 29946 1726882576.15331: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15353: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 29946 1726882576.15382: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15431: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 29946 1726882576.15442: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15666: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.15896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 29946 1726882576.15998: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 29946 1726882576.16050: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1048034d0> <<< 29946 1726882576.16060: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16127: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16207: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 29946 1726882576.16238: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 29946 1726882576.16241: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16282: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16331: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 29946 1726882576.16337: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16370: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16416: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16467: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16541: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 29946 1726882576.16579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 29946 1726882576.16684: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc10480df10> <<< 29946 1726882576.16719: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10480bf80> <<< 29946 1726882576.16755: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 29946 1726882576.16767: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16831: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16886: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16918: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.16967: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 29946 1726882576.16982: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 29946 1726882576.17022: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 29946 1726882576.17025: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 29946 1726882576.17104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 29946 1726882576.17108: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 29946 1726882576.17131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 29946 1726882576.17176: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104fee8d0> <<< 29946 1726882576.17219: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104fe25a0> <<< 29946 1726882576.17313: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10480e030> <<< 29946 1726882576.17317: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496f5c0> # destroy ansible.module_utils.distro <<< 29946 1726882576.17359: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 29946 1726882576.17374: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 29946 1726882576.17463: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 29946 1726882576.17467: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 29946 1726882576.17496: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.17613: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.17816: stdout chunk (state=3): >>># zipimport: zlib available <<< 29946 1726882576.17939: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 29946 1726882576.18285: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 29946 1726882576.18315: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants <<< 29946 1726882576.18346: stdout chunk (state=3): >>># cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 <<< 29946 1726882576.18382: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing 
tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder <<< 29946 1726882576.18422: stdout chunk (state=3): >>># cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd <<< 29946 1726882576.18446: stdout chunk (state=3): >>># destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters <<< 29946 1726882576.18478: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 29946 1726882576.18744: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 29946 1726882576.18747: stdout chunk (state=3): >>># destroy _bz2 <<< 29946 1726882576.18762: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 29946 1726882576.18790: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport <<< 29946 1726882576.18831: stdout chunk (state=3): >>># destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 29946 1726882576.18885: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 29946 1726882576.18892: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 29946 1726882576.18915: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 29946 1726882576.18961: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 29946 1726882576.18996: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping 
ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib <<< 29946 1726882576.19037: stdout chunk (state=3): >>># cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 29946 1726882576.19084: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 29946 1726882576.19111: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 29946 1726882576.19240: stdout chunk (state=3): >>># destroy sys.monitoring <<< 29946 1726882576.19286: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 29946 1726882576.19290: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 29946 1726882576.19342: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 29946 1726882576.19361: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 29946 1726882576.19374: stdout chunk (state=3): >>># clear sys.meta_path # clear 
sys.modules # destroy _frozen_importlib <<< 29946 1726882576.19459: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 <<< 29946 1726882576.19498: stdout chunk (state=3): >>># destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 29946 1726882576.19531: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator <<< 29946 1726882576.19557: stdout chunk (state=3): >>># destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 29946 1726882576.19568: stdout chunk (state=3): >>># clear sys.audit hooks <<< 29946 1726882576.19909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882576.19922: stderr chunk (state=3): >>><<< 29946 1726882576.19930: stdout chunk (state=3): >>><<< 29946 1726882576.20040: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1054184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1053e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10541aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1051c9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import 
'_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1051c9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105207e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105207f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10523f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10523ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10521fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10521d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105205010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10525f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10525e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # 
code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10521e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10525ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105294860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105204290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105294d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105294bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105294fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105202db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052956a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105295370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052965a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052ac7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1052ade80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052aed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1052af320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052ae270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1052afda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1052af4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105296510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc10503bbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105064740> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050644a0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105064680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105064fe0> # extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105065910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050648c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105039d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105066d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105065a60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105296750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10508f080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050b3440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105114230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105116990> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105114350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050e1250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc104f1d310> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1050b2240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc105067c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc104f1d5b0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_vvu4onnn/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f72fc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f51eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f510a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f712b0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104f9e9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9e780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9e090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9e4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f73c50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from 
'/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104f9f770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104f9f9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104f9fef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104911d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104913920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104914290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104915430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104917ef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc105207d10> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049161b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491fe60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491e960> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491e6c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10491ec00> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049166c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104967a10> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049681a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104969be0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049699a0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc10496c140> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496a2d0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104970410> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104970aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104970b30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104968380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1049f81a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1049f9250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104972930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc104973ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104972570> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module 
'_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc1048014f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1048022a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1049f9430> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104802900> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc1048034d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc10480df10> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10480bf80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104fee8d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc104fe25a0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10480e030> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc10496f5c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] 
removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing 
ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # 
destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # 
destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
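For reference, the stdout captured above carries the stat module's JSON result ({"changed": false, "stat": {"exists": false}, ...} for /run/ostree-booted, i.e. the managed node is not OSTree-based) followed by the Python interpreter's verbose import/cleanup trace; that trailing trace is what triggers the "junk after the JSON data" warning below. A minimal, illustrative Python sketch of splitting such output back into the JSON document and the trailing noise is shown here (this is a hypothetical helper for reading logs like this one, not Ansible's actual module-output parser):

```python
import json

def split_module_output(stdout: str):
    """Illustrative helper: split module stdout into (JSON result, trailing junk).

    Mirrors the situation in the log above, where the module's JSON result is
    followed by interpreter cleanup messages. Not Ansible's actual parser.
    """
    start = stdout.index("{")  # result begins at the first JSON object
    obj, end = json.JSONDecoder().raw_decode(stdout, start)
    return obj, stdout[end:].strip()

# Output shaped like the log above (trailing trace heavily shortened).
sample = ('{"changed": false, "stat": {"exists": false}} '
          '# destroy __main__ # clear sys.path_importer_cache')
result, junk = split_module_output(sample)
print(result["stat"]["exists"])                # -> False
print(junk.startswith("# destroy __main__"))   # -> True
```

Ansible tolerates this trailing output (hence the warning rather than a failure), so the task result itself is unaffected.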
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] 
removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 29946 1726882576.21153: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882576.21157: _low_level_execute_command(): starting 29946 1726882576.21160: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726882575.7976153-30044-25928716060437/ > /dev/null 2>&1 && sleep 0' 29946 1726882576.21363: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882576.21379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882576.21396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882576.21416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882576.21491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882576.21563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882576.21580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882576.21614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882576.21712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882576.24306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882576.24323: stdout chunk (state=3): >>><<< 29946 1726882576.24326: stderr chunk (state=3): >>><<< 29946 1726882576.24329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882576.24333: handler run complete 29946 1726882576.24337: attempt loop complete, returning result 29946 1726882576.24339: _execute() done 29946 1726882576.24346: dumping result to json 29946 1726882576.24353: done dumping result, returning 29946 1726882576.24363: done running TaskExecutor() for managed_node2/TASK: Check if system is ostree 
[12673a56-9f93-95e7-9dfb-0000000000c2] 29946 1726882576.24368: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c2 29946 1726882576.24719: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c2 29946 1726882576.24724: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 29946 1726882576.24833: no more pending results, returning what we have 29946 1726882576.24842: results queue empty 29946 1726882576.24843: checking for any_errors_fatal 29946 1726882576.24859: done checking for any_errors_fatal 29946 1726882576.24860: checking for max_fail_percentage 29946 1726882576.24862: done checking for max_fail_percentage 29946 1726882576.24863: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.24864: done checking to see if all hosts have failed 29946 1726882576.24865: getting the remaining hosts for this loop 29946 1726882576.24867: done getting the remaining hosts for this loop 29946 1726882576.24884: getting the next task for host managed_node2 29946 1726882576.24903: done getting next task for host managed_node2 29946 1726882576.24928: ^ task is: TASK: Set flag to indicate system is ostree 29946 1726882576.24946: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.24951: getting variables 29946 1726882576.24953: in VariableManager get_vars() 29946 1726882576.25145: Calling all_inventory to load vars for managed_node2 29946 1726882576.25150: Calling groups_inventory to load vars for managed_node2 29946 1726882576.25154: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.25168: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.25173: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.25177: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.25511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.25916: done with get_vars() 29946 1726882576.25933: done getting variables 29946 1726882576.26091: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:36:16 -0400 (0:00:00.511) 0:00:02.371 ****** 29946 1726882576.26162: entering _queue_task() for managed_node2/set_fact 29946 1726882576.26165: Creating lock for set_fact 29946 1726882576.26505: worker is 1 (out of 1 available) 29946 1726882576.26517: exiting _queue_task() for managed_node2/set_fact 29946 1726882576.26529: done queuing things up, now waiting for results queue to drain 29946 1726882576.26530: waiting for pending results... 
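At this point the trace has the result of the ostree probe (the stat of /run/ostree-booted returned exists: false) and is about to run the task at el_repo_setup.yml:22 that turns it into a fact. A minimal sketch of what that pair of tasks could look like, using only the paths, variable names and the conditional visible in the trace; the real file may register or guard them differently:

  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat

  - name: Set flag to indicate system is ostree
    set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined

The when condition matches the one evaluated a few lines below; how __ostree_booted_stat is actually populated (register versus an earlier set_fact) is not fully visible in this part of the trace.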
29946 1726882576.26765: running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree 29946 1726882576.26845: in run() - task 12673a56-9f93-95e7-9dfb-0000000000c3 29946 1726882576.26862: variable 'ansible_search_path' from source: unknown 29946 1726882576.26866: variable 'ansible_search_path' from source: unknown 29946 1726882576.26899: calling self._execute() 29946 1726882576.27081: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.27089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.27092: variable 'omit' from source: magic vars 29946 1726882576.27495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882576.27732: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882576.27770: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882576.27818: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882576.27849: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882576.27929: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882576.27952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882576.27982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882576.28016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882576.28125: Evaluated conditional (not __network_is_ostree is defined): True 29946 1726882576.28129: variable 'omit' from source: magic vars 29946 1726882576.28166: variable 'omit' from source: magic vars 29946 1726882576.28438: variable '__ostree_booted_stat' from source: set_fact 29946 1726882576.28442: variable 'omit' from source: magic vars 29946 1726882576.28444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882576.28447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882576.28449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882576.28451: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882576.28577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882576.28637: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882576.28640: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.28643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.28817: Set connection var ansible_pipelining to False 29946 
1726882576.28909: Set connection var ansible_shell_executable to /bin/sh 29946 1726882576.28912: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882576.28923: Set connection var ansible_timeout to 10 29946 1726882576.28926: Set connection var ansible_shell_type to sh 29946 1726882576.28928: Set connection var ansible_connection to ssh 29946 1726882576.28954: variable 'ansible_shell_executable' from source: unknown 29946 1726882576.28957: variable 'ansible_connection' from source: unknown 29946 1726882576.28960: variable 'ansible_module_compression' from source: unknown 29946 1726882576.28962: variable 'ansible_shell_type' from source: unknown 29946 1726882576.28964: variable 'ansible_shell_executable' from source: unknown 29946 1726882576.28967: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.28969: variable 'ansible_pipelining' from source: unknown 29946 1726882576.28972: variable 'ansible_timeout' from source: unknown 29946 1726882576.28977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.29295: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882576.29299: variable 'omit' from source: magic vars 29946 1726882576.29302: starting attempt loop 29946 1726882576.29304: running the handler 29946 1726882576.29307: handler run complete 29946 1726882576.29309: attempt loop complete, returning result 29946 1726882576.29310: _execute() done 29946 1726882576.29313: dumping result to json 29946 1726882576.29314: done dumping result, returning 29946 1726882576.29317: done running TaskExecutor() for managed_node2/TASK: Set flag to indicate system is ostree [12673a56-9f93-95e7-9dfb-0000000000c3] 29946 1726882576.29359: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c3 29946 1726882576.29416: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c3 29946 1726882576.29419: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 29946 1726882576.29483: no more pending results, returning what we have 29946 1726882576.29489: results queue empty 29946 1726882576.29490: checking for any_errors_fatal 29946 1726882576.29498: done checking for any_errors_fatal 29946 1726882576.29499: checking for max_fail_percentage 29946 1726882576.29501: done checking for max_fail_percentage 29946 1726882576.29502: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.29502: done checking to see if all hosts have failed 29946 1726882576.29503: getting the remaining hosts for this loop 29946 1726882576.29504: done getting the remaining hosts for this loop 29946 1726882576.29508: getting the next task for host managed_node2 29946 1726882576.29517: done getting next task for host managed_node2 29946 1726882576.29520: ^ task is: TASK: Fix CentOS6 Base repo 29946 1726882576.29523: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.29527: getting variables 29946 1726882576.29528: in VariableManager get_vars() 29946 1726882576.29555: Calling all_inventory to load vars for managed_node2 29946 1726882576.29558: Calling groups_inventory to load vars for managed_node2 29946 1726882576.29561: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.29572: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.29575: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.29583: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.30168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.30579: done with get_vars() 29946 1726882576.30591: done getting variables 29946 1726882576.31009: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:36:16 -0400 (0:00:00.048) 0:00:02.419 ****** 29946 1726882576.31036: entering _queue_task() for managed_node2/copy 29946 1726882576.31469: worker is 1 (out of 1 available) 29946 1726882576.31481: exiting _queue_task() for managed_node2/copy 29946 1726882576.31497: done queuing things up, now waiting for results queue to drain 29946 1726882576.31498: waiting for pending results... 
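The task queued here, Fix CentOS6 Base repo (el_repo_setup.yml:26), uses the copy action plugin and is evaluated next: the trace shows its first condition, ansible_distribution == 'CentOS', as True and its second, ansible_distribution_major_version == '6', as False, so it is skipped. A sketch of a task with that shape; the destination path and the file body are illustrative assumptions, only the name and the when list come from the trace:

  - name: Fix CentOS6 Base repo
    copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed destination
      content: |
        # illustrative placeholder; the real repo definition is not shown in the trace
        [base]
        name=CentOS-6 - Base
    when:
      - ansible_distribution == 'CentOS'
      - ansible_distribution_major_version == '6'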
29946 1726882576.32108: running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo 29946 1726882576.32114: in run() - task 12673a56-9f93-95e7-9dfb-0000000000c5 29946 1726882576.32116: variable 'ansible_search_path' from source: unknown 29946 1726882576.32119: variable 'ansible_search_path' from source: unknown 29946 1726882576.32121: calling self._execute() 29946 1726882576.32308: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.32322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.32365: variable 'omit' from source: magic vars 29946 1726882576.33301: variable 'ansible_distribution' from source: facts 29946 1726882576.33431: Evaluated conditional (ansible_distribution == 'CentOS'): True 29946 1726882576.33642: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.33653: Evaluated conditional (ansible_distribution_major_version == '6'): False 29946 1726882576.33661: when evaluation is False, skipping this task 29946 1726882576.33669: _execute() done 29946 1726882576.33679: dumping result to json 29946 1726882576.33686: done dumping result, returning 29946 1726882576.33699: done running TaskExecutor() for managed_node2/TASK: Fix CentOS6 Base repo [12673a56-9f93-95e7-9dfb-0000000000c5] 29946 1726882576.33709: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c5 29946 1726882576.34028: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c5 29946 1726882576.34032: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 29946 1726882576.34097: no more pending results, returning what we have 29946 1726882576.34100: results queue empty 29946 1726882576.34101: checking for any_errors_fatal 29946 1726882576.34106: done checking for any_errors_fatal 29946 1726882576.34106: checking for max_fail_percentage 29946 1726882576.34108: done checking for max_fail_percentage 29946 1726882576.34109: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.34110: done checking to see if all hosts have failed 29946 1726882576.34110: getting the remaining hosts for this loop 29946 1726882576.34111: done getting the remaining hosts for this loop 29946 1726882576.34115: getting the next task for host managed_node2 29946 1726882576.34119: done getting next task for host managed_node2 29946 1726882576.34122: ^ task is: TASK: Include the task 'enable_epel.yml' 29946 1726882576.34124: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.34127: getting variables 29946 1726882576.34129: in VariableManager get_vars() 29946 1726882576.34154: Calling all_inventory to load vars for managed_node2 29946 1726882576.34156: Calling groups_inventory to load vars for managed_node2 29946 1726882576.34160: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.34171: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.34174: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.34177: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.34459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.35097: done with get_vars() 29946 1726882576.35108: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:36:16 -0400 (0:00:00.041) 0:00:02.461 ****** 29946 1726882576.35190: entering _queue_task() for managed_node2/include_tasks 29946 1726882576.35813: worker is 1 (out of 1 available) 29946 1726882576.35823: exiting _queue_task() for managed_node2/include_tasks 29946 1726882576.35833: done queuing things up, now waiting for results queue to drain 29946 1726882576.35835: waiting for pending results... 29946 1726882576.36044: running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' 29946 1726882576.36210: in run() - task 12673a56-9f93-95e7-9dfb-0000000000c6 29946 1726882576.36257: variable 'ansible_search_path' from source: unknown 29946 1726882576.36266: variable 'ansible_search_path' from source: unknown 29946 1726882576.36463: calling self._execute() 29946 1726882576.36491: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.36507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.36520: variable 'omit' from source: magic vars 29946 1726882576.37612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882576.42149: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882576.42599: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882576.42603: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882576.42605: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882576.42608: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882576.42999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882576.43003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882576.43006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 29946 1726882576.43008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882576.43011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882576.43217: variable '__network_is_ostree' from source: set_fact 29946 1726882576.43240: Evaluated conditional (not __network_is_ostree | d(false)): True 29946 1726882576.43249: _execute() done 29946 1726882576.43256: dumping result to json 29946 1726882576.43262: done dumping result, returning 29946 1726882576.43272: done running TaskExecutor() for managed_node2/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-95e7-9dfb-0000000000c6] 29946 1726882576.43279: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c6 29946 1726882576.43385: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000c6 29946 1726882576.43395: WORKER PROCESS EXITING 29946 1726882576.43428: no more pending results, returning what we have 29946 1726882576.43433: in VariableManager get_vars() 29946 1726882576.43465: Calling all_inventory to load vars for managed_node2 29946 1726882576.43468: Calling groups_inventory to load vars for managed_node2 29946 1726882576.43470: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.43480: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.43483: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.43485: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.43804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.44338: done with get_vars() 29946 1726882576.44346: variable 'ansible_search_path' from source: unknown 29946 1726882576.44347: variable 'ansible_search_path' from source: unknown 29946 1726882576.44503: we have included files to process 29946 1726882576.44505: generating all_blocks data 29946 1726882576.44507: done generating all_blocks data 29946 1726882576.44512: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 29946 1726882576.44513: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 29946 1726882576.44516: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 29946 1726882576.46432: done processing included file 29946 1726882576.46434: iterating over new_blocks loaded from include file 29946 1726882576.46436: in VariableManager get_vars() 29946 1726882576.46448: done with get_vars() 29946 1726882576.46450: filtering new block on tags 29946 1726882576.46472: done filtering new block on tags 29946 1726882576.46475: in VariableManager get_vars() 29946 1726882576.46489: done with get_vars() 29946 1726882576.46491: filtering new block on tags 29946 1726882576.46505: done filtering new block on tags 29946 1726882576.46507: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node2 29946 1726882576.46513: extending task lists for all hosts with included blocks 
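The include just processed is the gate into the EPEL setup: el_repo_setup.yml:51 evaluates not __network_is_ostree | d(false), which is True here, loads enable_epel.yml, filters its blocks on tags and splices them into the task list for managed_node2. A minimal sketch of that include task, assuming the file is referenced relative to the including tasks directory:

  - name: Include the task 'enable_epel.yml'
    include_tasks: enable_epel.yml
    when: not __network_is_ostree | d(false)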
29946 1726882576.46949: done extending task lists 29946 1726882576.46950: done processing included files 29946 1726882576.46951: results queue empty 29946 1726882576.46952: checking for any_errors_fatal 29946 1726882576.46956: done checking for any_errors_fatal 29946 1726882576.46957: checking for max_fail_percentage 29946 1726882576.46958: done checking for max_fail_percentage 29946 1726882576.46958: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.46959: done checking to see if all hosts have failed 29946 1726882576.46960: getting the remaining hosts for this loop 29946 1726882576.46961: done getting the remaining hosts for this loop 29946 1726882576.46963: getting the next task for host managed_node2 29946 1726882576.46967: done getting next task for host managed_node2 29946 1726882576.46970: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 29946 1726882576.46972: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.46974: getting variables 29946 1726882576.46975: in VariableManager get_vars() 29946 1726882576.46983: Calling all_inventory to load vars for managed_node2 29946 1726882576.46985: Calling groups_inventory to load vars for managed_node2 29946 1726882576.46990: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.46997: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.47004: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.47007: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.47952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.48753: done with get_vars() 29946 1726882576.48762: done getting variables 29946 1726882576.48831: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 29946 1726882576.49465: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:36:16 -0400 (0:00:00.146) 0:00:02.608 ****** 29946 1726882576.49848: entering _queue_task() for managed_node2/command 29946 1726882576.49850: Creating lock for command 29946 1726882576.50676: worker is 1 (out of 1 available) 29946 1726882576.50692: exiting _queue_task() for managed_node2/command 29946 1726882576.51004: done queuing things up, now waiting for results queue to drain 29946 1726882576.51005: waiting for pending results... 
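The first included task (enable_epel.yml:8) has a templated name, Create EPEL {{ ansible_distribution_major_version }}, which renders to Create EPEL 10 on this host, and it is backed by the command action plugin. It is skipped just below because the host is not at major version 7 or 8. A sketch of the likely shape; the command itself is an assumption, only the name, the plugin and the two conditions are confirmed by the trace:

  - name: Create EPEL {{ ansible_distribution_major_version }}
    command: >-
      dnf install -y
      https://dl.fedoraproject.org/pub/epel/epel-release-latest-{{ ansible_distribution_major_version }}.noarch.rpm
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']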
29946 1726882576.51134: running TaskExecutor() for managed_node2/TASK: Create EPEL 10 29946 1726882576.51242: in run() - task 12673a56-9f93-95e7-9dfb-0000000000e0 29946 1726882576.51413: variable 'ansible_search_path' from source: unknown 29946 1726882576.51422: variable 'ansible_search_path' from source: unknown 29946 1726882576.51461: calling self._execute() 29946 1726882576.51531: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.51797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.51802: variable 'omit' from source: magic vars 29946 1726882576.52273: variable 'ansible_distribution' from source: facts 29946 1726882576.52599: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29946 1726882576.52644: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.52656: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29946 1726882576.52663: when evaluation is False, skipping this task 29946 1726882576.52668: _execute() done 29946 1726882576.52674: dumping result to json 29946 1726882576.52680: done dumping result, returning 29946 1726882576.52689: done running TaskExecutor() for managed_node2/TASK: Create EPEL 10 [12673a56-9f93-95e7-9dfb-0000000000e0] 29946 1726882576.52700: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e0 skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29946 1726882576.52856: no more pending results, returning what we have 29946 1726882576.52859: results queue empty 29946 1726882576.52860: checking for any_errors_fatal 29946 1726882576.52862: done checking for any_errors_fatal 29946 1726882576.52863: checking for max_fail_percentage 29946 1726882576.52866: done checking for max_fail_percentage 29946 1726882576.52867: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.52868: done checking to see if all hosts have failed 29946 1726882576.52868: getting the remaining hosts for this loop 29946 1726882576.52870: done getting the remaining hosts for this loop 29946 1726882576.52873: getting the next task for host managed_node2 29946 1726882576.52880: done getting next task for host managed_node2 29946 1726882576.52883: ^ task is: TASK: Install yum-utils package 29946 1726882576.52890: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.52896: getting variables 29946 1726882576.52897: in VariableManager get_vars() 29946 1726882576.52927: Calling all_inventory to load vars for managed_node2 29946 1726882576.52930: Calling groups_inventory to load vars for managed_node2 29946 1726882576.52933: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.52947: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.52950: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.52952: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.53560: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e0 29946 1726882576.53564: WORKER PROCESS EXITING 29946 1726882576.53578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.54275: done with get_vars() 29946 1726882576.54284: done getting variables 29946 1726882576.54576: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:36:16 -0400 (0:00:00.050) 0:00:02.658 ****** 29946 1726882576.54871: entering _queue_task() for managed_node2/package 29946 1726882576.54873: Creating lock for package 29946 1726882576.55696: worker is 1 (out of 1 available) 29946 1726882576.55711: exiting _queue_task() for managed_node2/package 29946 1726882576.55724: done queuing things up, now waiting for results queue to drain 29946 1726882576.55725: waiting for pending results... 
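enable_epel.yml:26 then queues Install yum-utils package through the generic package action plugin, guarded by the same distribution and version checks and likewise skipped here. A minimal sketch under those assumptions (state: present is assumed, the rest follows the trace):

  - name: Install yum-utils package
    package:
      name: yum-utils
      state: present   # assumed
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']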
29946 1726882576.56127: running TaskExecutor() for managed_node2/TASK: Install yum-utils package 29946 1726882576.56233: in run() - task 12673a56-9f93-95e7-9dfb-0000000000e1 29946 1726882576.56499: variable 'ansible_search_path' from source: unknown 29946 1726882576.56502: variable 'ansible_search_path' from source: unknown 29946 1726882576.56505: calling self._execute() 29946 1726882576.56627: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.56640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.56653: variable 'omit' from source: magic vars 29946 1726882576.57702: variable 'ansible_distribution' from source: facts 29946 1726882576.57705: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29946 1726882576.57708: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.57710: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29946 1726882576.57713: when evaluation is False, skipping this task 29946 1726882576.57715: _execute() done 29946 1726882576.57717: dumping result to json 29946 1726882576.57720: done dumping result, returning 29946 1726882576.57723: done running TaskExecutor() for managed_node2/TASK: Install yum-utils package [12673a56-9f93-95e7-9dfb-0000000000e1] 29946 1726882576.57725: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e1 29946 1726882576.57792: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e1 29946 1726882576.57797: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29946 1726882576.57852: no more pending results, returning what we have 29946 1726882576.57855: results queue empty 29946 1726882576.57856: checking for any_errors_fatal 29946 1726882576.57862: done checking for any_errors_fatal 29946 1726882576.57863: checking for max_fail_percentage 29946 1726882576.57865: done checking for max_fail_percentage 29946 1726882576.57866: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.57867: done checking to see if all hosts have failed 29946 1726882576.57867: getting the remaining hosts for this loop 29946 1726882576.57869: done getting the remaining hosts for this loop 29946 1726882576.57872: getting the next task for host managed_node2 29946 1726882576.57885: done getting next task for host managed_node2 29946 1726882576.57890: ^ task is: TASK: Enable EPEL 7 29946 1726882576.57897: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.57901: getting variables 29946 1726882576.57902: in VariableManager get_vars() 29946 1726882576.57932: Calling all_inventory to load vars for managed_node2 29946 1726882576.57934: Calling groups_inventory to load vars for managed_node2 29946 1726882576.57938: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.57951: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.57954: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.57957: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.58481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.59235: done with get_vars() 29946 1726882576.59245: done getting variables 29946 1726882576.59614: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 21:36:16 -0400 (0:00:00.047) 0:00:02.705 ****** 29946 1726882576.59641: entering _queue_task() for managed_node2/command 29946 1726882576.60090: worker is 1 (out of 1 available) 29946 1726882576.60104: exiting _queue_task() for managed_node2/command 29946 1726882576.60115: done queuing things up, now waiting for results queue to drain 29946 1726882576.60116: waiting for pending results... 29946 1726882576.60745: running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 29946 1726882576.61134: in run() - task 12673a56-9f93-95e7-9dfb-0000000000e2 29946 1726882576.61149: variable 'ansible_search_path' from source: unknown 29946 1726882576.61153: variable 'ansible_search_path' from source: unknown 29946 1726882576.61184: calling self._execute() 29946 1726882576.61570: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.61576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.61585: variable 'omit' from source: magic vars 29946 1726882576.62766: variable 'ansible_distribution' from source: facts 29946 1726882576.62778: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29946 1726882576.63300: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.63306: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29946 1726882576.63309: when evaluation is False, skipping this task 29946 1726882576.63312: _execute() done 29946 1726882576.63314: dumping result to json 29946 1726882576.63317: done dumping result, returning 29946 1726882576.63323: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 7 [12673a56-9f93-95e7-9dfb-0000000000e2] 29946 1726882576.63328: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e2 29946 1726882576.63422: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e2 29946 1726882576.63425: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29946 1726882576.63638: no more pending results, returning what we 
have 29946 1726882576.63641: results queue empty 29946 1726882576.63642: checking for any_errors_fatal 29946 1726882576.63649: done checking for any_errors_fatal 29946 1726882576.63649: checking for max_fail_percentage 29946 1726882576.63651: done checking for max_fail_percentage 29946 1726882576.63651: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.63652: done checking to see if all hosts have failed 29946 1726882576.63653: getting the remaining hosts for this loop 29946 1726882576.63654: done getting the remaining hosts for this loop 29946 1726882576.63657: getting the next task for host managed_node2 29946 1726882576.63664: done getting next task for host managed_node2 29946 1726882576.63666: ^ task is: TASK: Enable EPEL 8 29946 1726882576.63670: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.63674: getting variables 29946 1726882576.63675: in VariableManager get_vars() 29946 1726882576.63707: Calling all_inventory to load vars for managed_node2 29946 1726882576.63710: Calling groups_inventory to load vars for managed_node2 29946 1726882576.63714: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.63727: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.63730: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.63733: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.64154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.64557: done with get_vars() 29946 1726882576.64567: done getting variables 29946 1726882576.64829: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:36:16 -0400 (0:00:00.052) 0:00:02.758 ****** 29946 1726882576.64857: entering _queue_task() for managed_node2/command 29946 1726882576.65316: worker is 1 (out of 1 available) 29946 1726882576.65330: exiting _queue_task() for managed_node2/command 29946 1726882576.65341: done queuing things up, now waiting for results queue to drain 29946 1726882576.65343: waiting for pending results... 
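Enable EPEL 7 (enable_epel.yml:32) and Enable EPEL 8 (enable_epel.yml:37) are command tasks skipped for the same false condition. Sketched here on the assumption that they shell out to yum-config-manager; only the names, the action plugin and the when conditions are taken from the trace, and the EPEL 8 task would be analogous:

  - name: Enable EPEL 7
    command: yum-config-manager --enable epel   # assumed command
    when:
      - ansible_distribution in ['RedHat', 'CentOS']
      - ansible_distribution_major_version in ['7', '8']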
29946 1726882576.65812: running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 29946 1726882576.66038: in run() - task 12673a56-9f93-95e7-9dfb-0000000000e3 29946 1726882576.66061: variable 'ansible_search_path' from source: unknown 29946 1726882576.66069: variable 'ansible_search_path' from source: unknown 29946 1726882576.66111: calling self._execute() 29946 1726882576.66500: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.66504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.66507: variable 'omit' from source: magic vars 29946 1726882576.67148: variable 'ansible_distribution' from source: facts 29946 1726882576.67168: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29946 1726882576.67497: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.67508: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 29946 1726882576.67516: when evaluation is False, skipping this task 29946 1726882576.67522: _execute() done 29946 1726882576.67528: dumping result to json 29946 1726882576.67535: done dumping result, returning 29946 1726882576.67545: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 8 [12673a56-9f93-95e7-9dfb-0000000000e3] 29946 1726882576.67552: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e3 29946 1726882576.67801: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e3 29946 1726882576.67805: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 29946 1726882576.67846: no more pending results, returning what we have 29946 1726882576.67850: results queue empty 29946 1726882576.67851: checking for any_errors_fatal 29946 1726882576.67856: done checking for any_errors_fatal 29946 1726882576.67857: checking for max_fail_percentage 29946 1726882576.67859: done checking for max_fail_percentage 29946 1726882576.67859: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.67860: done checking to see if all hosts have failed 29946 1726882576.67861: getting the remaining hosts for this loop 29946 1726882576.67862: done getting the remaining hosts for this loop 29946 1726882576.67866: getting the next task for host managed_node2 29946 1726882576.67874: done getting next task for host managed_node2 29946 1726882576.67877: ^ task is: TASK: Enable EPEL 6 29946 1726882576.67881: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.67884: getting variables 29946 1726882576.67886: in VariableManager get_vars() 29946 1726882576.67916: Calling all_inventory to load vars for managed_node2 29946 1726882576.67919: Calling groups_inventory to load vars for managed_node2 29946 1726882576.67922: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.67934: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.67937: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.67940: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.68229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.68836: done with get_vars() 29946 1726882576.68846: done getting variables 29946 1726882576.69113: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:36:16 -0400 (0:00:00.042) 0:00:02.800 ****** 29946 1726882576.69149: entering _queue_task() for managed_node2/copy 29946 1726882576.69566: worker is 1 (out of 1 available) 29946 1726882576.69577: exiting _queue_task() for managed_node2/copy 29946 1726882576.69587: done queuing things up, now waiting for results queue to drain 29946 1726882576.69589: waiting for pending results... 29946 1726882576.70210: running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 29946 1726882576.70215: in run() - task 12673a56-9f93-95e7-9dfb-0000000000e5 29946 1726882576.70218: variable 'ansible_search_path' from source: unknown 29946 1726882576.70221: variable 'ansible_search_path' from source: unknown 29946 1726882576.70276: calling self._execute() 29946 1726882576.70347: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.70354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.70363: variable 'omit' from source: magic vars 29946 1726882576.71319: variable 'ansible_distribution' from source: facts 29946 1726882576.71332: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 29946 1726882576.71770: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.71774: Evaluated conditional (ansible_distribution_major_version == '6'): False 29946 1726882576.71779: when evaluation is False, skipping this task 29946 1726882576.71903: _execute() done 29946 1726882576.71907: dumping result to json 29946 1726882576.71909: done dumping result, returning 29946 1726882576.71915: done running TaskExecutor() for managed_node2/TASK: Enable EPEL 6 [12673a56-9f93-95e7-9dfb-0000000000e5] 29946 1726882576.71919: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e5 29946 1726882576.72200: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000e5 29946 1726882576.72203: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 29946 1726882576.72245: no more pending results, returning what we have 29946 
1726882576.72248: results queue empty 29946 1726882576.72249: checking for any_errors_fatal 29946 1726882576.72255: done checking for any_errors_fatal 29946 1726882576.72256: checking for max_fail_percentage 29946 1726882576.72258: done checking for max_fail_percentage 29946 1726882576.72258: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.72259: done checking to see if all hosts have failed 29946 1726882576.72260: getting the remaining hosts for this loop 29946 1726882576.72261: done getting the remaining hosts for this loop 29946 1726882576.72265: getting the next task for host managed_node2 29946 1726882576.72273: done getting next task for host managed_node2 29946 1726882576.72275: ^ task is: TASK: Set network provider to 'nm' 29946 1726882576.72277: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.72281: getting variables 29946 1726882576.72282: in VariableManager get_vars() 29946 1726882576.72311: Calling all_inventory to load vars for managed_node2 29946 1726882576.72314: Calling groups_inventory to load vars for managed_node2 29946 1726882576.72317: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.72327: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.72330: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.72333: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.72638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.73038: done with get_vars() 29946 1726882576.73047: done getting variables 29946 1726882576.73301: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:13 Friday 20 September 2024 21:36:16 -0400 (0:00:00.041) 0:00:02.842 ****** 29946 1726882576.73326: entering _queue_task() for managed_node2/set_fact 29946 1726882576.73749: worker is 1 (out of 1 available) 29946 1726882576.73761: exiting _queue_task() for managed_node2/set_fact 29946 1726882576.73774: done queuing things up, now waiting for results queue to drain 29946 1726882576.73775: waiting for pending results... 
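
The two skips above come from tasks gated on the gathered distribution facts: "Enable EPEL 8" requires ansible_distribution_major_version in ['7', '8'] and "Enable EPEL 6" requires it to equal '6', and on this CentOS Stream 10 host both conditionals evaluate to False. A minimal, hypothetical sketch of tasks gated the same way is shown below; the when: expressions mirror the log, but the module arguments are illustrative and are not the contents of enable_epel.yml.

    - name: Enable EPEL 8
      command: dnf config-manager --set-enabled epel   # illustrative action; only the conditions below match the log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

    - name: Enable EPEL 6
      copy:                                            # the log shows the 'copy' action being loaded for this task
        src: epel.repo                                 # illustrative paths
        dest: /etc/yum.repos.d/epel.repo
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version == '6'

Note how the log evaluates the list items in order: the distribution check comes back True, the major-version check comes back False, and the task is skipped with skip_reason "Conditional result was False" and the failing expression recorded in false_condition.
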
29946 1726882576.74342: running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' 29946 1726882576.74635: in run() - task 12673a56-9f93-95e7-9dfb-000000000007 29946 1726882576.74653: variable 'ansible_search_path' from source: unknown 29946 1726882576.74683: calling self._execute() 29946 1726882576.74970: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.75000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.75004: variable 'omit' from source: magic vars 29946 1726882576.75400: variable 'omit' from source: magic vars 29946 1726882576.75403: variable 'omit' from source: magic vars 29946 1726882576.75561: variable 'omit' from source: magic vars 29946 1726882576.75565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882576.75698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882576.75702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882576.75891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882576.75896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882576.75899: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882576.75901: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.75903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.76134: Set connection var ansible_pipelining to False 29946 1726882576.76145: Set connection var ansible_shell_executable to /bin/sh 29946 1726882576.76179: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882576.76541: Set connection var ansible_timeout to 10 29946 1726882576.76544: Set connection var ansible_shell_type to sh 29946 1726882576.76546: Set connection var ansible_connection to ssh 29946 1726882576.76548: variable 'ansible_shell_executable' from source: unknown 29946 1726882576.76550: variable 'ansible_connection' from source: unknown 29946 1726882576.76552: variable 'ansible_module_compression' from source: unknown 29946 1726882576.76554: variable 'ansible_shell_type' from source: unknown 29946 1726882576.76556: variable 'ansible_shell_executable' from source: unknown 29946 1726882576.76558: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.76560: variable 'ansible_pipelining' from source: unknown 29946 1726882576.76562: variable 'ansible_timeout' from source: unknown 29946 1726882576.76564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.77089: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882576.77096: variable 'omit' from source: magic vars 29946 1726882576.77099: starting attempt loop 29946 1726882576.77102: running the handler 29946 1726882576.77104: handler run complete 29946 1726882576.77106: attempt loop complete, returning result 29946 1726882576.77108: _execute() done 29946 1726882576.77110: 
dumping result to json 29946 1726882576.77112: done dumping result, returning 29946 1726882576.77120: done running TaskExecutor() for managed_node2/TASK: Set network provider to 'nm' [12673a56-9f93-95e7-9dfb-000000000007] 29946 1726882576.77129: sending task result for task 12673a56-9f93-95e7-9dfb-000000000007 ok: [managed_node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 29946 1726882576.77334: no more pending results, returning what we have 29946 1726882576.77337: results queue empty 29946 1726882576.77337: checking for any_errors_fatal 29946 1726882576.77343: done checking for any_errors_fatal 29946 1726882576.77344: checking for max_fail_percentage 29946 1726882576.77346: done checking for max_fail_percentage 29946 1726882576.77346: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.77347: done checking to see if all hosts have failed 29946 1726882576.77348: getting the remaining hosts for this loop 29946 1726882576.77349: done getting the remaining hosts for this loop 29946 1726882576.77352: getting the next task for host managed_node2 29946 1726882576.77359: done getting next task for host managed_node2 29946 1726882576.77360: ^ task is: TASK: meta (flush_handlers) 29946 1726882576.77362: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.77366: getting variables 29946 1726882576.77367: in VariableManager get_vars() 29946 1726882576.77396: Calling all_inventory to load vars for managed_node2 29946 1726882576.77398: Calling groups_inventory to load vars for managed_node2 29946 1726882576.77402: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.77412: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.77416: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.77419: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.77804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.78209: done with get_vars() 29946 1726882576.78218: done getting variables 29946 1726882576.78256: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000007 29946 1726882576.78260: WORKER PROCESS EXITING 29946 1726882576.78528: in VariableManager get_vars() 29946 1726882576.78536: Calling all_inventory to load vars for managed_node2 29946 1726882576.78538: Calling groups_inventory to load vars for managed_node2 29946 1726882576.78541: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.78545: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.78547: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.78550: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.78719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.79119: done with get_vars() 29946 1726882576.79133: done queuing things up, now waiting for results queue to drain 29946 1726882576.79134: results queue empty 29946 1726882576.79135: checking for any_errors_fatal 29946 1726882576.79137: done checking for any_errors_fatal 29946 1726882576.79138: checking for 
max_fail_percentage 29946 1726882576.79139: done checking for max_fail_percentage 29946 1726882576.79139: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.79140: done checking to see if all hosts have failed 29946 1726882576.79141: getting the remaining hosts for this loop 29946 1726882576.79142: done getting the remaining hosts for this loop 29946 1726882576.79144: getting the next task for host managed_node2 29946 1726882576.79147: done getting next task for host managed_node2 29946 1726882576.79148: ^ task is: TASK: meta (flush_handlers) 29946 1726882576.79149: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.79157: getting variables 29946 1726882576.79158: in VariableManager get_vars() 29946 1726882576.79165: Calling all_inventory to load vars for managed_node2 29946 1726882576.79167: Calling groups_inventory to load vars for managed_node2 29946 1726882576.79169: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.79173: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.79175: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.79178: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.79526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.79928: done with get_vars() 29946 1726882576.79936: done getting variables 29946 1726882576.79975: in VariableManager get_vars() 29946 1726882576.79982: Calling all_inventory to load vars for managed_node2 29946 1726882576.79984: Calling groups_inventory to load vars for managed_node2 29946 1726882576.79987: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.79990: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.79994: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.79997: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.80364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.80760: done with get_vars() 29946 1726882576.80770: done queuing things up, now waiting for results queue to drain 29946 1726882576.80772: results queue empty 29946 1726882576.80773: checking for any_errors_fatal 29946 1726882576.80774: done checking for any_errors_fatal 29946 1726882576.80775: checking for max_fail_percentage 29946 1726882576.80776: done checking for max_fail_percentage 29946 1726882576.80776: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.80777: done checking to see if all hosts have failed 29946 1726882576.80778: getting the remaining hosts for this loop 29946 1726882576.80779: done getting the remaining hosts for this loop 29946 1726882576.80781: getting the next task for host managed_node2 29946 1726882576.80783: done getting next task for host managed_node2 29946 1726882576.80784: ^ task is: None 29946 1726882576.80786: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.80787: done queuing things up, now waiting for results queue to drain 29946 1726882576.80787: results queue empty 29946 1726882576.80788: checking for any_errors_fatal 29946 1726882576.80789: done checking for any_errors_fatal 29946 1726882576.80789: checking for max_fail_percentage 29946 1726882576.80790: done checking for max_fail_percentage 29946 1726882576.80791: checking to see if all hosts have failed and the running result is not ok 29946 1726882576.80791: done checking to see if all hosts have failed 29946 1726882576.80795: getting the next task for host managed_node2 29946 1726882576.80797: done getting next task for host managed_node2 29946 1726882576.80798: ^ task is: None 29946 1726882576.80799: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882576.81010: in VariableManager get_vars() 29946 1726882576.81030: done with get_vars() 29946 1726882576.81036: in VariableManager get_vars() 29946 1726882576.81047: done with get_vars() 29946 1726882576.81051: variable 'omit' from source: magic vars 29946 1726882576.81080: in VariableManager get_vars() 29946 1726882576.81095: done with get_vars() 29946 1726882576.81116: variable 'omit' from source: magic vars PLAY [Test for testing routing rules] ****************************************** 29946 1726882576.81802: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29946 1726882576.81844: getting the remaining hosts for this loop 29946 1726882576.81845: done getting the remaining hosts for this loop 29946 1726882576.81848: getting the next task for host managed_node2 29946 1726882576.81851: done getting next task for host managed_node2 29946 1726882576.81853: ^ task is: TASK: Gathering Facts 29946 1726882576.81854: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882576.81856: getting variables 29946 1726882576.81857: in VariableManager get_vars() 29946 1726882576.81867: Calling all_inventory to load vars for managed_node2 29946 1726882576.81869: Calling groups_inventory to load vars for managed_node2 29946 1726882576.81871: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882576.81875: Calling all_plugins_play to load vars for managed_node2 29946 1726882576.81888: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882576.81892: Calling groups_plugins_play to load vars for managed_node2 29946 1726882576.82242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882576.82637: done with get_vars() 29946 1726882576.82646: done getting variables 29946 1726882576.82684: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3 Friday 20 September 2024 21:36:16 -0400 (0:00:00.093) 0:00:02.936 ****** 29946 1726882576.82708: entering _queue_task() for managed_node2/gather_facts 29946 1726882576.83367: worker is 1 (out of 1 available) 29946 1726882576.83379: exiting _queue_task() for managed_node2/gather_facts 29946 1726882576.83391: done queuing things up, now waiting for results queue to drain 29946 1726882576.83395: waiting for pending results... 
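
Earlier in this play the set_fact task returned ok: with {"network_provider": "nm"}, pinning the provider fact for the routing-rules test before facts are re-gathered for the next play. A minimal sketch consistent with that result (task path tests_routing_rules_nm.yml:13) follows; the set_fact task matches what the log reports, while the debug consumer is purely illustrative:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm

    # Illustrative consumer (not in the log): later tasks or roles can branch on the fact.
    - name: Show selected provider
      debug:
        msg: "Using network provider {{ network_provider }}"

set_fact is an action plugin that runs entirely on the controller, which is why the log shows "handler run complete" immediately after "running the handler" with no _low_level_execute_command() calls, unlike the Gathering Facts task that follows and has to open an SSH connection to the managed host.
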
29946 1726882576.84029: running TaskExecutor() for managed_node2/TASK: Gathering Facts 29946 1726882576.84033: in run() - task 12673a56-9f93-95e7-9dfb-00000000010b 29946 1726882576.84037: variable 'ansible_search_path' from source: unknown 29946 1726882576.84065: calling self._execute() 29946 1726882576.84241: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.84247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.84255: variable 'omit' from source: magic vars 29946 1726882576.85100: variable 'ansible_distribution_major_version' from source: facts 29946 1726882576.85103: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882576.85106: variable 'omit' from source: magic vars 29946 1726882576.85108: variable 'omit' from source: magic vars 29946 1726882576.85111: variable 'omit' from source: magic vars 29946 1726882576.85227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882576.85259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882576.85278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882576.85296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882576.85425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882576.85534: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882576.85537: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.85540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.85659: Set connection var ansible_pipelining to False 29946 1726882576.85898: Set connection var ansible_shell_executable to /bin/sh 29946 1726882576.85902: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882576.85904: Set connection var ansible_timeout to 10 29946 1726882576.85907: Set connection var ansible_shell_type to sh 29946 1726882576.85909: Set connection var ansible_connection to ssh 29946 1726882576.85911: variable 'ansible_shell_executable' from source: unknown 29946 1726882576.85913: variable 'ansible_connection' from source: unknown 29946 1726882576.85915: variable 'ansible_module_compression' from source: unknown 29946 1726882576.85917: variable 'ansible_shell_type' from source: unknown 29946 1726882576.85919: variable 'ansible_shell_executable' from source: unknown 29946 1726882576.85921: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882576.85923: variable 'ansible_pipelining' from source: unknown 29946 1726882576.85924: variable 'ansible_timeout' from source: unknown 29946 1726882576.85926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882576.86284: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882576.86292: variable 'omit' from source: magic vars 29946 1726882576.86299: starting attempt loop 29946 1726882576.86302: running the 
handler 29946 1726882576.86432: variable 'ansible_facts' from source: unknown 29946 1726882576.86453: _low_level_execute_command(): starting 29946 1726882576.86463: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882576.87958: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882576.88214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882576.88228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882576.88284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882576.88430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882576.88799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882576.90513: stdout chunk (state=3): >>>/root <<< 29946 1726882576.90810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882576.90849: stderr chunk (state=3): >>><<< 29946 1726882576.90860: stdout chunk (state=3): >>><<< 29946 1726882576.90896: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882576.90919: _low_level_execute_command(): starting 29946 1726882576.91071: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650 `" && echo 
ansible-tmp-1726882576.9090405-30096-129069896594650="` echo /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650 `" ) && sleep 0' 29946 1726882576.91984: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882576.92298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882576.92328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882576.92363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882576.92428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882576.94398: stdout chunk (state=3): >>>ansible-tmp-1726882576.9090405-30096-129069896594650=/root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650 <<< 29946 1726882576.94539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882576.94550: stdout chunk (state=3): >>><<< 29946 1726882576.94562: stderr chunk (state=3): >>><<< 29946 1726882576.94584: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882576.9090405-30096-129069896594650=/root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882576.94899: variable 'ansible_module_compression' from source: unknown 29946 1726882576.94902: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 
1726882576.94904: variable 'ansible_facts' from source: unknown 29946 1726882576.95217: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py 29946 1726882576.95527: Sending initial data 29946 1726882576.95536: Sent initial data (154 bytes) 29946 1726882576.96771: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882576.97038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882576.97096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882576.98700: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882576.98781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882576.98866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmppp3_nko2 /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py <<< 29946 1726882576.98870: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py" <<< 29946 1726882576.98948: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmppp3_nko2" to remote "/root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py" <<< 29946 1726882577.01215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882577.01228: stdout chunk (state=3): >>><<< 29946 1726882577.01239: stderr chunk (state=3): >>><<< 29946 1726882577.01263: done transferring module to remote 29946 1726882577.01278: _low_level_execute_command(): starting 29946 1726882577.01297: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/ /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py && sleep 0' 29946 1726882577.02674: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882577.02779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882577.02884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882577.02982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882577.04848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882577.04957: stderr chunk (state=3): >>><<< 29946 1726882577.04969: stdout chunk (state=3): >>><<< 29946 1726882577.04998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882577.05013: _low_level_execute_command(): starting 29946 1726882577.05023: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/AnsiballZ_setup.py && sleep 0' 29946 1726882577.06235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882577.06345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882577.06495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882577.06525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882577.06623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882577.71600: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_fips": false, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", 
"XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.63330078125, "5m": 0.5224609375, "15m": 0.2919921875}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "17", "epoch": "1726882577", "epoch_int": "1726882577", "date": "2024-09-20", "time": "21:36:17", "iso8601_micro": "2024-09-21T01:36:17.347948Z", "iso8601": "2024-09-21T01:36:17Z", "iso8601_basic": "20240920T213617347948", "iso8601_basic_short": "20240920T213617", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": 
"fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on 
[fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": <<< 29946 1726882577.71790: stdout chunk (state=3): >>>"off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, 
"start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 767, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789876224, "block_size": 4096, "block_total": 65519099, "block_available": 63913544, "block_used": 1605555, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882577.73700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882577.73764: stderr chunk (state=3): >>><<< 29946 1726882577.73774: stdout chunk (state=3): >>><<< 29946 1726882577.74002: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_fips": false, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": 
"/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.63330078125, "5m": 0.5224609375, "15m": 0.2919921875}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "17", "epoch": "1726882577", "epoch_int": "1726882577", "date": "2024-09-20", "time": "21:36:17", "iso8601_micro": "2024-09-21T01:36:17.347948Z", "iso8601": "2024-09-21T01:36:17Z", "iso8601_basic": "20240920T213617347948", "iso8601_basic_short": "20240920T213617", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_interfaces": ["rpltstbr", "eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, 
"type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": 
"off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", 
"l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 767, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789876224, "block_size": 4096, "block_total": 65519099, "block_available": 63913544, "block_used": 1605555, "inode_total": 131070960, "inode_available": 131029051, "inode_used": 41909, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882577.74581: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882577.74734: _low_level_execute_command(): starting 29946 1726882577.74738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882576.9090405-30096-129069896594650/ > /dev/null 2>&1 && sleep 0' 29946 1726882577.76100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882577.76201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882577.76333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882577.76356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882577.76452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882577.78367: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882577.78613: stderr chunk (state=3): >>><<< 29946 1726882577.78616: stdout chunk (state=3): >>><<< 29946 1726882577.78638: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882577.78652: handler run complete 29946 1726882577.78797: variable 'ansible_facts' from source: unknown 29946 1726882577.79620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.80559: variable 'ansible_facts' from source: unknown 29946 1726882577.80801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.81358: attempt loop complete, returning result 29946 1726882577.81361: _execute() done 29946 1726882577.81363: dumping result to json 29946 1726882577.81365: done dumping result, returning 29946 1726882577.81367: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-95e7-9dfb-00000000010b] 29946 1726882577.81369: sending task result for task 12673a56-9f93-95e7-9dfb-00000000010b 29946 1726882577.82413: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000010b 29946 1726882577.82417: WORKER PROCESS EXITING ok: [managed_node2] 29946 1726882577.82924: no more pending results, returning what we have 29946 1726882577.82926: results queue empty 29946 1726882577.82927: checking for any_errors_fatal 29946 1726882577.82928: done checking for any_errors_fatal 29946 1726882577.82929: checking for max_fail_percentage 29946 1726882577.82931: done checking for max_fail_percentage 29946 1726882577.82931: checking to see if all hosts have failed and the running result is not ok 29946 1726882577.82932: done checking to see if all hosts have failed 29946 1726882577.82933: getting the remaining hosts for this loop 29946 1726882577.82934: done getting the remaining hosts for this loop 29946 1726882577.82937: getting the next task for host managed_node2 29946 1726882577.82942: done getting next task for host managed_node2 29946 1726882577.82944: ^ task is: TASK: meta (flush_handlers) 29946 1726882577.82946: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882577.82949: getting variables 29946 1726882577.82950: in VariableManager get_vars() 29946 1726882577.83091: Calling all_inventory to load vars for managed_node2 29946 1726882577.83179: Calling groups_inventory to load vars for managed_node2 29946 1726882577.83183: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882577.83196: Calling all_plugins_play to load vars for managed_node2 29946 1726882577.83198: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882577.83201: Calling groups_plugins_play to load vars for managed_node2 29946 1726882577.83617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.84115: done with get_vars() 29946 1726882577.84127: done getting variables 29946 1726882577.84344: in VariableManager get_vars() 29946 1726882577.84515: Calling all_inventory to load vars for managed_node2 29946 1726882577.84519: Calling groups_inventory to load vars for managed_node2 29946 1726882577.84521: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882577.84529: Calling all_plugins_play to load vars for managed_node2 29946 1726882577.84532: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882577.84548: Calling groups_plugins_play to load vars for managed_node2 29946 1726882577.85061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.85618: done with get_vars() 29946 1726882577.85633: done queuing things up, now waiting for results queue to drain 29946 1726882577.85642: results queue empty 29946 1726882577.85643: checking for any_errors_fatal 29946 1726882577.85648: done checking for any_errors_fatal 29946 1726882577.85649: checking for max_fail_percentage 29946 1726882577.85656: done checking for max_fail_percentage 29946 1726882577.85657: checking to see if all hosts have failed and the running result is not ok 29946 1726882577.85658: done checking to see if all hosts have failed 29946 1726882577.85662: getting the remaining hosts for this loop 29946 1726882577.85663: done getting the remaining hosts for this loop 29946 1726882577.85665: getting the next task for host managed_node2 29946 1726882577.85671: done getting next task for host managed_node2 29946 1726882577.85674: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 29946 1726882577.85675: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882577.85677: getting variables 29946 1726882577.85679: in VariableManager get_vars() 29946 1726882577.85695: Calling all_inventory to load vars for managed_node2 29946 1726882577.85697: Calling groups_inventory to load vars for managed_node2 29946 1726882577.85702: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882577.85708: Calling all_plugins_play to load vars for managed_node2 29946 1726882577.85711: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882577.85799: Calling groups_plugins_play to load vars for managed_node2 29946 1726882577.86214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.86645: done with get_vars() 29946 1726882577.86653: done getting variables 29946 1726882577.86723: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882577.86865: variable 'type' from source: play vars 29946 1726882577.86871: variable 'interface' from source: play vars TASK [Set type=veth and interface=ethtest0] ************************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:10 Friday 20 September 2024 21:36:17 -0400 (0:00:01.044) 0:00:03.980 ****** 29946 1726882577.87123: entering _queue_task() for managed_node2/set_fact 29946 1726882577.87641: worker is 1 (out of 1 available) 29946 1726882577.88199: exiting _queue_task() for managed_node2/set_fact 29946 1726882577.88210: done queuing things up, now waiting for results queue to drain 29946 1726882577.88212: waiting for pending results... 
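The task being queued here is the first explicit task of the play, defined at tests_routing_rules.yml:10. A plausible reconstruction, inferred only from its templated name and from the ansible_facts it reports a little further below (type=veth, interface=ethtest0), is:

# Hypothetical reconstruction of the task at tests_routing_rules.yml:10;
# the real file may differ. It promotes the play vars to host facts.
- name: "Set type={{ type }} and interface={{ interface }}"
  ansible.builtin.set_fact:
    type: "{{ type }}"            # resolves to "veth" from play vars
    interface: "{{ interface }}"  # resolves to "ethtest0" from play vars

Because set_fact runs entirely on the controller, the handler completes without another round trip to the managed node, which is why no SSH traffic appears for this task in the log.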
29946 1726882577.88421: running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=ethtest0 29946 1726882577.88426: in run() - task 12673a56-9f93-95e7-9dfb-00000000000b 29946 1726882577.88441: variable 'ansible_search_path' from source: unknown 29946 1726882577.88482: calling self._execute() 29946 1726882577.88691: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882577.88708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882577.88723: variable 'omit' from source: magic vars 29946 1726882577.89317: variable 'ansible_distribution_major_version' from source: facts 29946 1726882577.89336: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882577.89348: variable 'omit' from source: magic vars 29946 1726882577.89370: variable 'omit' from source: magic vars 29946 1726882577.89421: variable 'type' from source: play vars 29946 1726882577.89524: variable 'type' from source: play vars 29946 1726882577.89538: variable 'interface' from source: play vars 29946 1726882577.89716: variable 'interface' from source: play vars 29946 1726882577.89720: variable 'omit' from source: magic vars 29946 1726882577.89722: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882577.89725: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882577.89756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882577.89781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882577.89808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882577.89884: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882577.89901: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882577.89911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882577.90051: Set connection var ansible_pipelining to False 29946 1726882577.90069: Set connection var ansible_shell_executable to /bin/sh 29946 1726882577.90281: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882577.90284: Set connection var ansible_timeout to 10 29946 1726882577.90289: Set connection var ansible_shell_type to sh 29946 1726882577.90294: Set connection var ansible_connection to ssh 29946 1726882577.90300: variable 'ansible_shell_executable' from source: unknown 29946 1726882577.90306: variable 'ansible_connection' from source: unknown 29946 1726882577.90308: variable 'ansible_module_compression' from source: unknown 29946 1726882577.90310: variable 'ansible_shell_type' from source: unknown 29946 1726882577.90312: variable 'ansible_shell_executable' from source: unknown 29946 1726882577.90314: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882577.90316: variable 'ansible_pipelining' from source: unknown 29946 1726882577.90318: variable 'ansible_timeout' from source: unknown 29946 1726882577.90320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882577.90504: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882577.90508: variable 'omit' from source: magic vars 29946 1726882577.90511: starting attempt loop 29946 1726882577.90513: running the handler 29946 1726882577.90516: handler run complete 29946 1726882577.90518: attempt loop complete, returning result 29946 1726882577.90519: _execute() done 29946 1726882577.90522: dumping result to json 29946 1726882577.90524: done dumping result, returning 29946 1726882577.90567: done running TaskExecutor() for managed_node2/TASK: Set type=veth and interface=ethtest0 [12673a56-9f93-95e7-9dfb-00000000000b] 29946 1726882577.90571: sending task result for task 12673a56-9f93-95e7-9dfb-00000000000b ok: [managed_node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 29946 1726882577.90843: no more pending results, returning what we have 29946 1726882577.90847: results queue empty 29946 1726882577.90848: checking for any_errors_fatal 29946 1726882577.90851: done checking for any_errors_fatal 29946 1726882577.90851: checking for max_fail_percentage 29946 1726882577.90853: done checking for max_fail_percentage 29946 1726882577.90854: checking to see if all hosts have failed and the running result is not ok 29946 1726882577.90855: done checking to see if all hosts have failed 29946 1726882577.90855: getting the remaining hosts for this loop 29946 1726882577.90857: done getting the remaining hosts for this loop 29946 1726882577.90861: getting the next task for host managed_node2 29946 1726882577.90869: done getting next task for host managed_node2 29946 1726882577.90875: ^ task is: TASK: Include the task 'show_interfaces.yml' 29946 1726882577.90877: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882577.90881: getting variables 29946 1726882577.90883: in VariableManager get_vars() 29946 1726882577.90932: Calling all_inventory to load vars for managed_node2 29946 1726882577.90934: Calling groups_inventory to load vars for managed_node2 29946 1726882577.90937: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882577.90949: Calling all_plugins_play to load vars for managed_node2 29946 1726882577.90952: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882577.90955: Calling groups_plugins_play to load vars for managed_node2 29946 1726882577.91901: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000000b 29946 1726882577.91905: WORKER PROCESS EXITING 29946 1726882577.91959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.92743: done with get_vars() 29946 1726882577.92755: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:14 Friday 20 September 2024 21:36:17 -0400 (0:00:00.061) 0:00:04.042 ****** 29946 1726882577.93246: entering _queue_task() for managed_node2/include_tasks 29946 1726882577.93990: worker is 1 (out of 1 available) 29946 1726882577.94173: exiting _queue_task() for managed_node2/include_tasks 29946 1726882577.94185: done queuing things up, now waiting for results queue to drain 29946 1726882577.94186: waiting for pending results... 29946 1726882577.94479: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 29946 1726882577.94701: in run() - task 12673a56-9f93-95e7-9dfb-00000000000c 29946 1726882577.94721: variable 'ansible_search_path' from source: unknown 29946 1726882577.94764: calling self._execute() 29946 1726882577.94889: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882577.95028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882577.95242: variable 'omit' from source: magic vars 29946 1726882577.96213: variable 'ansible_distribution_major_version' from source: facts 29946 1726882577.96217: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882577.96224: _execute() done 29946 1726882577.96227: dumping result to json 29946 1726882577.96229: done dumping result, returning 29946 1726882577.96231: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-95e7-9dfb-00000000000c] 29946 1726882577.96233: sending task result for task 12673a56-9f93-95e7-9dfb-00000000000c 29946 1726882577.96333: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000000c 29946 1726882577.96336: WORKER PROCESS EXITING 29946 1726882577.96363: no more pending results, returning what we have 29946 1726882577.96368: in VariableManager get_vars() 29946 1726882577.96505: Calling all_inventory to load vars for managed_node2 29946 1726882577.96508: Calling groups_inventory to load vars for managed_node2 29946 1726882577.96510: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882577.96529: Calling all_plugins_play to load vars for managed_node2 29946 1726882577.96533: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882577.96536: Calling groups_plugins_play to load vars for managed_node2 29946 1726882577.96870: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882577.97529: done with get_vars() 29946 1726882577.97637: variable 'ansible_search_path' from source: unknown 29946 1726882577.97656: we have included files to process 29946 1726882577.97657: generating all_blocks data 29946 1726882577.97658: done generating all_blocks data 29946 1726882577.97659: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29946 1726882577.97660: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29946 1726882577.97662: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29946 1726882577.98112: in VariableManager get_vars() 29946 1726882577.98129: done with get_vars() 29946 1726882577.98343: done processing included file 29946 1726882577.98345: iterating over new_blocks loaded from include file 29946 1726882577.98347: in VariableManager get_vars() 29946 1726882577.98360: done with get_vars() 29946 1726882577.98361: filtering new block on tags 29946 1726882577.98543: done filtering new block on tags 29946 1726882577.98545: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 29946 1726882577.98550: extending task lists for all hosts with included blocks 29946 1726882578.01019: done extending task lists 29946 1726882578.01021: done processing included files 29946 1726882578.01021: results queue empty 29946 1726882578.01022: checking for any_errors_fatal 29946 1726882578.01026: done checking for any_errors_fatal 29946 1726882578.01027: checking for max_fail_percentage 29946 1726882578.01028: done checking for max_fail_percentage 29946 1726882578.01033: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.01033: done checking to see if all hosts have failed 29946 1726882578.01034: getting the remaining hosts for this loop 29946 1726882578.01035: done getting the remaining hosts for this loop 29946 1726882578.01038: getting the next task for host managed_node2 29946 1726882578.01042: done getting next task for host managed_node2 29946 1726882578.01044: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 29946 1726882578.01047: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882578.01049: getting variables 29946 1726882578.01050: in VariableManager get_vars() 29946 1726882578.01062: Calling all_inventory to load vars for managed_node2 29946 1726882578.01064: Calling groups_inventory to load vars for managed_node2 29946 1726882578.01066: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.01072: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.01074: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.01077: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.01241: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.01546: done with get_vars() 29946 1726882578.01560: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:18 -0400 (0:00:00.084) 0:00:04.126 ****** 29946 1726882578.01650: entering _queue_task() for managed_node2/include_tasks 29946 1726882578.02074: worker is 1 (out of 1 available) 29946 1726882578.02092: exiting _queue_task() for managed_node2/include_tasks 29946 1726882578.02324: done queuing things up, now waiting for results queue to drain 29946 1726882578.02325: waiting for pending results... 29946 1726882578.02568: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 29946 1726882578.02659: in run() - task 12673a56-9f93-95e7-9dfb-000000000121 29946 1726882578.02735: variable 'ansible_search_path' from source: unknown 29946 1726882578.02739: variable 'ansible_search_path' from source: unknown 29946 1726882578.02742: calling self._execute() 29946 1726882578.02779: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.02789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.02797: variable 'omit' from source: magic vars 29946 1726882578.03333: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.03438: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.03474: _execute() done 29946 1726882578.03515: dumping result to json 29946 1726882578.03675: done dumping result, returning 29946 1726882578.03679: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-95e7-9dfb-000000000121] 29946 1726882578.03685: sending task result for task 12673a56-9f93-95e7-9dfb-000000000121 29946 1726882578.03775: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000121 29946 1726882578.03778: WORKER PROCESS EXITING 29946 1726882578.03844: no more pending results, returning what we have 29946 1726882578.03851: in VariableManager get_vars() 29946 1726882578.04104: Calling all_inventory to load vars for managed_node2 29946 1726882578.04107: Calling groups_inventory to load vars for managed_node2 29946 1726882578.04110: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.04119: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.04121: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.04124: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.04392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 29946 1726882578.04633: done with get_vars() 29946 1726882578.04641: variable 'ansible_search_path' from source: unknown 29946 1726882578.04642: variable 'ansible_search_path' from source: unknown 29946 1726882578.04681: we have included files to process 29946 1726882578.04682: generating all_blocks data 29946 1726882578.04684: done generating all_blocks data 29946 1726882578.04684: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29946 1726882578.04688: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29946 1726882578.04691: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29946 1726882578.05079: done processing included file 29946 1726882578.05081: iterating over new_blocks loaded from include file 29946 1726882578.05083: in VariableManager get_vars() 29946 1726882578.05110: done with get_vars() 29946 1726882578.05111: filtering new block on tags 29946 1726882578.05128: done filtering new block on tags 29946 1726882578.05130: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed_node2 29946 1726882578.05134: extending task lists for all hosts with included blocks 29946 1726882578.05275: done extending task lists 29946 1726882578.05276: done processing included files 29946 1726882578.05277: results queue empty 29946 1726882578.05278: checking for any_errors_fatal 29946 1726882578.05281: done checking for any_errors_fatal 29946 1726882578.05282: checking for max_fail_percentage 29946 1726882578.05283: done checking for max_fail_percentage 29946 1726882578.05283: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.05284: done checking to see if all hosts have failed 29946 1726882578.05285: getting the remaining hosts for this loop 29946 1726882578.05288: done getting the remaining hosts for this loop 29946 1726882578.05291: getting the next task for host managed_node2 29946 1726882578.05298: done getting next task for host managed_node2 29946 1726882578.05300: ^ task is: TASK: Gather current interface info 29946 1726882578.05303: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882578.05305: getting variables 29946 1726882578.05306: in VariableManager get_vars() 29946 1726882578.05317: Calling all_inventory to load vars for managed_node2 29946 1726882578.05319: Calling groups_inventory to load vars for managed_node2 29946 1726882578.05321: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.05325: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.05328: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.05330: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.05748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.06011: done with get_vars() 29946 1726882578.06023: done getting variables 29946 1726882578.06067: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:18 -0400 (0:00:00.044) 0:00:04.170 ****** 29946 1726882578.06098: entering _queue_task() for managed_node2/command 29946 1726882578.06350: worker is 1 (out of 1 available) 29946 1726882578.06472: exiting _queue_task() for managed_node2/command 29946 1726882578.06482: done queuing things up, now waiting for results queue to drain 29946 1726882578.06483: waiting for pending results... 
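The two includes traced above form a short chain: the play includes show_interfaces.yml, whose task at line 3 in turn includes get_current_interfaces.yml, and the first task of that file ("Gather current interface info") is now being queued. A minimal sketch of the include step at show_interfaces.yml:3, assuming nothing beyond the task name and path recorded in this log, is:

# Sketch of show_interfaces.yml:3 as implied by the log; the file contents are assumed.
- name: Include the task 'get_current_interfaces.yml'
  ansible.builtin.include_tasks:
    file: get_current_interfaces.yml

include_tasks is evaluated at run time, which is why the log shows the new blocks being generated, filtered on tags, and spliced into the host's task list only after the include task itself has run.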
29946 1726882578.06811: running TaskExecutor() for managed_node2/TASK: Gather current interface info 29946 1726882578.06817: in run() - task 12673a56-9f93-95e7-9dfb-0000000001b0 29946 1726882578.06820: variable 'ansible_search_path' from source: unknown 29946 1726882578.06822: variable 'ansible_search_path' from source: unknown 29946 1726882578.06825: calling self._execute() 29946 1726882578.06898: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.06910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.06922: variable 'omit' from source: magic vars 29946 1726882578.07303: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.07321: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.07337: variable 'omit' from source: magic vars 29946 1726882578.07383: variable 'omit' from source: magic vars 29946 1726882578.07442: variable 'omit' from source: magic vars 29946 1726882578.07463: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882578.07506: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882578.07527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882578.07662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882578.07665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882578.07668: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882578.07674: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.07677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.07792: Set connection var ansible_pipelining to False 29946 1726882578.07797: Set connection var ansible_shell_executable to /bin/sh 29946 1726882578.07803: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882578.07809: Set connection var ansible_timeout to 10 29946 1726882578.07816: Set connection var ansible_shell_type to sh 29946 1726882578.07818: Set connection var ansible_connection to ssh 29946 1726882578.07841: variable 'ansible_shell_executable' from source: unknown 29946 1726882578.07844: variable 'ansible_connection' from source: unknown 29946 1726882578.07846: variable 'ansible_module_compression' from source: unknown 29946 1726882578.07849: variable 'ansible_shell_type' from source: unknown 29946 1726882578.07851: variable 'ansible_shell_executable' from source: unknown 29946 1726882578.07853: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.07855: variable 'ansible_pipelining' from source: unknown 29946 1726882578.07899: variable 'ansible_timeout' from source: unknown 29946 1726882578.07903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.08064: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882578.08068: variable 'omit' from source: magic vars 29946 
1726882578.08071: starting attempt loop 29946 1726882578.08073: running the handler 29946 1726882578.08075: _low_level_execute_command(): starting 29946 1726882578.08077: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882578.09690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882578.09696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882578.09700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882578.09865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882578.09869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882578.09891: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882578.09916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882578.10061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882578.11846: stdout chunk (state=3): >>>/root <<< 29946 1726882578.11872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882578.11899: stderr chunk (state=3): >>><<< 29946 1726882578.12115: stdout chunk (state=3): >>><<< 29946 1726882578.12141: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882578.12153: _low_level_execute_command(): starting 29946 1726882578.12158: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542 `" && echo ansible-tmp-1726882578.1213994-30130-83974515486542="` echo /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542 `" ) && sleep 0' 29946 1726882578.13399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882578.13554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882578.13577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882578.13592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882578.13801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882578.15678: stdout chunk (state=3): >>>ansible-tmp-1726882578.1213994-30130-83974515486542=/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542 <<< 29946 1726882578.15854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882578.15859: stdout chunk (state=3): >>><<< 29946 1726882578.15865: stderr chunk (state=3): >>><<< 29946 1726882578.15973: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882578.1213994-30130-83974515486542=/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882578.16098: variable 'ansible_module_compression' from source: unknown 29946 1726882578.16102: ANSIBALLZ: Using generic lock for ansible.legacy.command 29946 
1726882578.16104: ANSIBALLZ: Acquiring lock 29946 1726882578.16106: ANSIBALLZ: Lock acquired: 140626579263984 29946 1726882578.16202: ANSIBALLZ: Creating module 29946 1726882578.36188: ANSIBALLZ: Writing module into payload 29946 1726882578.36385: ANSIBALLZ: Writing module 29946 1726882578.36611: ANSIBALLZ: Renaming module 29946 1726882578.36617: ANSIBALLZ: Done creating module 29946 1726882578.36635: variable 'ansible_facts' from source: unknown 29946 1726882578.36717: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py 29946 1726882578.37024: Sending initial data 29946 1726882578.37027: Sent initial data (155 bytes) 29946 1726882578.37743: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882578.37816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882578.37867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882578.37888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882578.37929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882578.37995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882578.39796: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882578.39799: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882578.40014: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpdff3xdrp /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py <<< 29946 1726882578.40018: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py" <<< 29946 1726882578.40078: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpdff3xdrp" to remote "/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py" <<< 29946 1726882578.40996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882578.40999: stdout chunk (state=3): >>><<< 29946 1726882578.41001: stderr chunk (state=3): >>><<< 29946 1726882578.41128: done transferring module to remote 29946 1726882578.41131: _low_level_execute_command(): starting 29946 1726882578.41134: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/ /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py && sleep 0' 29946 1726882578.41680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882578.41699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882578.41713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882578.41737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882578.41753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882578.41807: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882578.41866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882578.41884: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882578.41910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882578.42004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882578.43882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882578.43896: stdout chunk (state=3): >>><<< 29946 1726882578.43903: stderr chunk (state=3): >>><<< 29946 1726882578.43920: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882578.43923: _low_level_execute_command(): starting 29946 1726882578.43928: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/AnsiballZ_command.py && sleep 0' 29946 1726882578.44514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882578.44609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882578.44657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882578.44680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882578.44701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882578.44826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882578.60280: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:18.596959", "end": "2024-09-20 21:36:18.600209", "delta": "0:00:00.003250", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882578.61551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882578.61678: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882578.61681: stdout chunk (state=3): >>><<< 29946 1726882578.61684: stderr chunk (state=3): >>><<< 29946 1726882578.61707: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:18.596959", "end": "2024-09-20 21:36:18.600209", "delta": "0:00:00.003250", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
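The module result above (cmd ["ls", "-1"] run with chdir /sys/class/net) corresponds to the "Gather current interface info" task from get_current_interfaces.yml. A rough reconstruction of that task, with the register name inferred from the '_current_interfaces' variable referenced later in this run, would be:

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # name inferred from the later variable lookup in this log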
29946 1726882578.61746: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882578.61983: _low_level_execute_command(): starting 29946 1726882578.61988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882578.1213994-30130-83974515486542/ > /dev/null 2>&1 && sleep 0' 29946 1726882578.63169: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882578.63233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882578.63253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882578.63356: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882578.63708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882578.63897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882578.65762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882578.65765: stdout chunk (state=3): >>><<< 29946 1726882578.65768: stderr chunk (state=3): >>><<< 29946 1726882578.65785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882578.65802: handler run complete 29946 1726882578.65867: Evaluated conditional (False): False 29946 1726882578.65883: attempt loop complete, returning result 29946 1726882578.65904: _execute() done 29946 1726882578.66101: dumping result to json 29946 1726882578.66105: done dumping result, returning 29946 1726882578.66107: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-95e7-9dfb-0000000001b0] 29946 1726882578.66109: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001b0 29946 1726882578.66323: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001b0 29946 1726882578.66326: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003250", "end": "2024-09-20 21:36:18.600209", "rc": 0, "start": "2024-09-20 21:36:18.596959" } STDOUT: bonding_masters eth0 lo rpltstbr 29946 1726882578.66410: no more pending results, returning what we have 29946 1726882578.66415: results queue empty 29946 1726882578.66416: checking for any_errors_fatal 29946 1726882578.66418: done checking for any_errors_fatal 29946 1726882578.66418: checking for max_fail_percentage 29946 1726882578.66420: done checking for max_fail_percentage 29946 1726882578.66421: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.66422: done checking to see if all hosts have failed 29946 1726882578.66423: getting the remaining hosts for this loop 29946 1726882578.66424: done getting the remaining hosts for this loop 29946 1726882578.66427: getting the next task for host managed_node2 29946 1726882578.66434: done getting next task for host managed_node2 29946 1726882578.66499: ^ task is: TASK: Set current_interfaces 29946 1726882578.66504: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882578.66510: getting variables 29946 1726882578.66511: in VariableManager get_vars() 29946 1726882578.66809: Calling all_inventory to load vars for managed_node2 29946 1726882578.66812: Calling groups_inventory to load vars for managed_node2 29946 1726882578.66815: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.66826: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.66830: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.66833: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.67389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.67859: done with get_vars() 29946 1726882578.67872: done getting variables 29946 1726882578.67982: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:18 -0400 (0:00:00.619) 0:00:04.789 ****** 29946 1726882578.68016: entering _queue_task() for managed_node2/set_fact 29946 1726882578.68425: worker is 1 (out of 1 available) 29946 1726882578.68437: exiting _queue_task() for managed_node2/set_fact 29946 1726882578.68449: done queuing things up, now waiting for results queue to drain 29946 1726882578.68450: waiting for pending results... 
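The "Set current_interfaces" task queued above lives at get_current_interfaces.yml:9. Judging from the registered '_current_interfaces' result it reads and the fact it produces below, it presumably looks roughly like this (the exact Jinja expression is an assumption):

- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"   # assumed; yields the interface list seen in the result below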
29946 1726882578.68714: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 29946 1726882578.68900: in run() - task 12673a56-9f93-95e7-9dfb-0000000001b1 29946 1726882578.68904: variable 'ansible_search_path' from source: unknown 29946 1726882578.68907: variable 'ansible_search_path' from source: unknown 29946 1726882578.68909: calling self._execute() 29946 1726882578.69003: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.69016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.69037: variable 'omit' from source: magic vars 29946 1726882578.69325: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.69335: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.69341: variable 'omit' from source: magic vars 29946 1726882578.69378: variable 'omit' from source: magic vars 29946 1726882578.69455: variable '_current_interfaces' from source: set_fact 29946 1726882578.69506: variable 'omit' from source: magic vars 29946 1726882578.69538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882578.69564: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882578.69582: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882578.69604: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882578.69614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882578.69637: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882578.69640: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.69642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.69721: Set connection var ansible_pipelining to False 29946 1726882578.69725: Set connection var ansible_shell_executable to /bin/sh 29946 1726882578.69730: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882578.69735: Set connection var ansible_timeout to 10 29946 1726882578.69741: Set connection var ansible_shell_type to sh 29946 1726882578.69744: Set connection var ansible_connection to ssh 29946 1726882578.69761: variable 'ansible_shell_executable' from source: unknown 29946 1726882578.69764: variable 'ansible_connection' from source: unknown 29946 1726882578.69767: variable 'ansible_module_compression' from source: unknown 29946 1726882578.69769: variable 'ansible_shell_type' from source: unknown 29946 1726882578.69772: variable 'ansible_shell_executable' from source: unknown 29946 1726882578.69774: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.69776: variable 'ansible_pipelining' from source: unknown 29946 1726882578.69778: variable 'ansible_timeout' from source: unknown 29946 1726882578.69784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.69884: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 29946 1726882578.69896: variable 'omit' from source: magic vars 29946 1726882578.70000: starting attempt loop 29946 1726882578.70003: running the handler 29946 1726882578.70006: handler run complete 29946 1726882578.70008: attempt loop complete, returning result 29946 1726882578.70010: _execute() done 29946 1726882578.70013: dumping result to json 29946 1726882578.70016: done dumping result, returning 29946 1726882578.70032: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-95e7-9dfb-0000000001b1] 29946 1726882578.70035: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001b1 29946 1726882578.70091: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001b1 29946 1726882578.70096: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 29946 1726882578.70150: no more pending results, returning what we have 29946 1726882578.70153: results queue empty 29946 1726882578.70154: checking for any_errors_fatal 29946 1726882578.70163: done checking for any_errors_fatal 29946 1726882578.70164: checking for max_fail_percentage 29946 1726882578.70166: done checking for max_fail_percentage 29946 1726882578.70166: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.70167: done checking to see if all hosts have failed 29946 1726882578.70168: getting the remaining hosts for this loop 29946 1726882578.70169: done getting the remaining hosts for this loop 29946 1726882578.70173: getting the next task for host managed_node2 29946 1726882578.70179: done getting next task for host managed_node2 29946 1726882578.70181: ^ task is: TASK: Show current_interfaces 29946 1726882578.70184: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882578.70188: getting variables 29946 1726882578.70189: in VariableManager get_vars() 29946 1726882578.70222: Calling all_inventory to load vars for managed_node2 29946 1726882578.70224: Calling groups_inventory to load vars for managed_node2 29946 1726882578.70226: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.70235: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.70237: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.70239: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.70463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.71036: done with get_vars() 29946 1726882578.71047: done getting variables 29946 1726882578.71147: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:18 -0400 (0:00:00.031) 0:00:04.821 ****** 29946 1726882578.71176: entering _queue_task() for managed_node2/debug 29946 1726882578.71178: Creating lock for debug 29946 1726882578.71778: worker is 1 (out of 1 available) 29946 1726882578.71796: exiting _queue_task() for managed_node2/debug 29946 1726882578.71806: done queuing things up, now waiting for results queue to drain 29946 1726882578.71807: waiting for pending results... 
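The "Show current_interfaces" task being queued here (show_interfaces.yml:5) is a plain debug of the fact just set; judging from the MSG emitted further down, it is roughly:

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"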
29946 1726882578.72310: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 29946 1726882578.72411: in run() - task 12673a56-9f93-95e7-9dfb-000000000122 29946 1726882578.72513: variable 'ansible_search_path' from source: unknown 29946 1726882578.72519: variable 'ansible_search_path' from source: unknown 29946 1726882578.72523: calling self._execute() 29946 1726882578.72587: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.72597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.72607: variable 'omit' from source: magic vars 29946 1726882578.72896: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.72907: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.72912: variable 'omit' from source: magic vars 29946 1726882578.72939: variable 'omit' from source: magic vars 29946 1726882578.73011: variable 'current_interfaces' from source: set_fact 29946 1726882578.73032: variable 'omit' from source: magic vars 29946 1726882578.73065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882578.73095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882578.73112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882578.73125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882578.73135: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882578.73161: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882578.73164: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.73166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.73240: Set connection var ansible_pipelining to False 29946 1726882578.73244: Set connection var ansible_shell_executable to /bin/sh 29946 1726882578.73249: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882578.73254: Set connection var ansible_timeout to 10 29946 1726882578.73261: Set connection var ansible_shell_type to sh 29946 1726882578.73263: Set connection var ansible_connection to ssh 29946 1726882578.73285: variable 'ansible_shell_executable' from source: unknown 29946 1726882578.73291: variable 'ansible_connection' from source: unknown 29946 1726882578.73296: variable 'ansible_module_compression' from source: unknown 29946 1726882578.73299: variable 'ansible_shell_type' from source: unknown 29946 1726882578.73301: variable 'ansible_shell_executable' from source: unknown 29946 1726882578.73303: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.73305: variable 'ansible_pipelining' from source: unknown 29946 1726882578.73308: variable 'ansible_timeout' from source: unknown 29946 1726882578.73310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.73400: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
29946 1726882578.73409: variable 'omit' from source: magic vars 29946 1726882578.73413: starting attempt loop 29946 1726882578.73416: running the handler 29946 1726882578.73451: handler run complete 29946 1726882578.73461: attempt loop complete, returning result 29946 1726882578.73464: _execute() done 29946 1726882578.73466: dumping result to json 29946 1726882578.73469: done dumping result, returning 29946 1726882578.73476: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-95e7-9dfb-000000000122] 29946 1726882578.73478: sending task result for task 12673a56-9f93-95e7-9dfb-000000000122 29946 1726882578.73560: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000122 29946 1726882578.73562: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 29946 1726882578.73611: no more pending results, returning what we have 29946 1726882578.73615: results queue empty 29946 1726882578.73616: checking for any_errors_fatal 29946 1726882578.73621: done checking for any_errors_fatal 29946 1726882578.73621: checking for max_fail_percentage 29946 1726882578.73623: done checking for max_fail_percentage 29946 1726882578.73623: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.73624: done checking to see if all hosts have failed 29946 1726882578.73625: getting the remaining hosts for this loop 29946 1726882578.73626: done getting the remaining hosts for this loop 29946 1726882578.73630: getting the next task for host managed_node2 29946 1726882578.73637: done getting next task for host managed_node2 29946 1726882578.73640: ^ task is: TASK: Include the task 'manage_test_interface.yml' 29946 1726882578.73642: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882578.73645: getting variables 29946 1726882578.73646: in VariableManager get_vars() 29946 1726882578.73679: Calling all_inventory to load vars for managed_node2 29946 1726882578.73682: Calling groups_inventory to load vars for managed_node2 29946 1726882578.73684: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.73698: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.73701: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.73704: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.73881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.74008: done with get_vars() 29946 1726882578.74017: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:16 Friday 20 September 2024 21:36:18 -0400 (0:00:00.028) 0:00:04.850 ****** 29946 1726882578.74075: entering _queue_task() for managed_node2/include_tasks 29946 1726882578.74276: worker is 1 (out of 1 available) 29946 1726882578.74288: exiting _queue_task() for managed_node2/include_tasks 29946 1726882578.74303: done queuing things up, now waiting for results queue to drain 29946 1726882578.74305: waiting for pending results... 
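The include at tests_routing_rules.yml:16 pulls in manage_test_interface.yml via include_tasks (the action named in the _queue_task entry above). Assuming the file is referenced relative to the playbook directory, the task would read approximately:

- name: Include the task 'manage_test_interface.yml'
  include_tasks: tasks/manage_test_interface.yml   # relative path is an assumption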
29946 1726882578.74553: running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' 29946 1726882578.74699: in run() - task 12673a56-9f93-95e7-9dfb-00000000000d 29946 1726882578.74703: variable 'ansible_search_path' from source: unknown 29946 1726882578.74706: calling self._execute() 29946 1726882578.74772: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.74782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.74802: variable 'omit' from source: magic vars 29946 1726882578.75180: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.75202: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.75215: _execute() done 29946 1726882578.75229: dumping result to json 29946 1726882578.75297: done dumping result, returning 29946 1726882578.75301: done running TaskExecutor() for managed_node2/TASK: Include the task 'manage_test_interface.yml' [12673a56-9f93-95e7-9dfb-00000000000d] 29946 1726882578.75303: sending task result for task 12673a56-9f93-95e7-9dfb-00000000000d 29946 1726882578.75364: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000000d 29946 1726882578.75366: WORKER PROCESS EXITING 29946 1726882578.75423: no more pending results, returning what we have 29946 1726882578.75428: in VariableManager get_vars() 29946 1726882578.75469: Calling all_inventory to load vars for managed_node2 29946 1726882578.75472: Calling groups_inventory to load vars for managed_node2 29946 1726882578.75474: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.75486: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.75488: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.75491: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.76237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.76738: done with get_vars() 29946 1726882578.76746: variable 'ansible_search_path' from source: unknown 29946 1726882578.76845: we have included files to process 29946 1726882578.76846: generating all_blocks data 29946 1726882578.76848: done generating all_blocks data 29946 1726882578.76851: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 29946 1726882578.76853: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 29946 1726882578.76855: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 29946 1726882578.77535: in VariableManager get_vars() 29946 1726882578.77554: done with get_vars() 29946 1726882578.77770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 29946 1726882578.78356: done processing included file 29946 1726882578.78597: iterating over new_blocks loaded from include file 29946 1726882578.78599: in VariableManager get_vars() 29946 1726882578.78615: done with get_vars() 29946 1726882578.78616: filtering new block on tags 29946 1726882578.78646: done filtering new block on tags 29946 1726882578.78648: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed_node2 29946 1726882578.78653: extending task lists for all hosts with included blocks 29946 1726882578.81346: done extending task lists 29946 1726882578.81348: done processing included files 29946 1726882578.81349: results queue empty 29946 1726882578.81349: checking for any_errors_fatal 29946 1726882578.81352: done checking for any_errors_fatal 29946 1726882578.81353: checking for max_fail_percentage 29946 1726882578.81354: done checking for max_fail_percentage 29946 1726882578.81355: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.81356: done checking to see if all hosts have failed 29946 1726882578.81356: getting the remaining hosts for this loop 29946 1726882578.81358: done getting the remaining hosts for this loop 29946 1726882578.81360: getting the next task for host managed_node2 29946 1726882578.81364: done getting next task for host managed_node2 29946 1726882578.81366: ^ task is: TASK: Ensure state in ["present", "absent"] 29946 1726882578.81369: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882578.81371: getting variables 29946 1726882578.81372: in VariableManager get_vars() 29946 1726882578.81389: Calling all_inventory to load vars for managed_node2 29946 1726882578.81392: Calling groups_inventory to load vars for managed_node2 29946 1726882578.81397: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.81403: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.81405: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.81408: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.81583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.82252: done with get_vars() 29946 1726882578.82262: done getting variables 29946 1726882578.82336: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Friday 20 September 2024 21:36:18 -0400 (0:00:00.082) 0:00:04.933 ****** 29946 1726882578.82362: entering _queue_task() for managed_node2/fail 29946 1726882578.82364: Creating lock for fail 29946 1726882578.83197: worker is 1 (out of 1 available) 29946 1726882578.83207: exiting _queue_task() for managed_node2/fail 29946 1726882578.83218: done queuing things up, now waiting for results queue to drain 29946 1726882578.83220: waiting for pending results... 
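manage_test_interface.yml opens with two guard tasks (lines 3 and 8) that fail on unexpected parameters; the one queued here checks 'state', and the matching 'type' check appears a little further down in the log. Reconstructed from the false_condition strings in their skipped results (the failure messages are assumptions):

- name: Ensure state in ["present", "absent"]
  fail:
    msg: "Unsupported state"   # assumed message text
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "Unsupported type"    # assumed message text
  when: type not in ["dummy", "tap", "veth"]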
29946 1726882578.83839: running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] 29946 1726882578.83844: in run() - task 12673a56-9f93-95e7-9dfb-0000000001cc 29946 1726882578.83848: variable 'ansible_search_path' from source: unknown 29946 1726882578.83851: variable 'ansible_search_path' from source: unknown 29946 1726882578.83919: calling self._execute() 29946 1726882578.84151: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.84155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.84158: variable 'omit' from source: magic vars 29946 1726882578.84954: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.84957: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.85165: variable 'state' from source: include params 29946 1726882578.85499: Evaluated conditional (state not in ["present", "absent"]): False 29946 1726882578.85502: when evaluation is False, skipping this task 29946 1726882578.85505: _execute() done 29946 1726882578.85507: dumping result to json 29946 1726882578.85509: done dumping result, returning 29946 1726882578.85513: done running TaskExecutor() for managed_node2/TASK: Ensure state in ["present", "absent"] [12673a56-9f93-95e7-9dfb-0000000001cc] 29946 1726882578.85515: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001cc 29946 1726882578.85580: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001cc 29946 1726882578.85583: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 29946 1726882578.85635: no more pending results, returning what we have 29946 1726882578.85640: results queue empty 29946 1726882578.85641: checking for any_errors_fatal 29946 1726882578.85643: done checking for any_errors_fatal 29946 1726882578.85643: checking for max_fail_percentage 29946 1726882578.85645: done checking for max_fail_percentage 29946 1726882578.85646: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.85647: done checking to see if all hosts have failed 29946 1726882578.85648: getting the remaining hosts for this loop 29946 1726882578.85649: done getting the remaining hosts for this loop 29946 1726882578.85653: getting the next task for host managed_node2 29946 1726882578.85660: done getting next task for host managed_node2 29946 1726882578.85662: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 29946 1726882578.85666: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882578.85670: getting variables 29946 1726882578.85672: in VariableManager get_vars() 29946 1726882578.85716: Calling all_inventory to load vars for managed_node2 29946 1726882578.85719: Calling groups_inventory to load vars for managed_node2 29946 1726882578.85725: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.85742: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.85746: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.85749: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.86383: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.86932: done with get_vars() 29946 1726882578.86943: done getting variables 29946 1726882578.87100: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Friday 20 September 2024 21:36:18 -0400 (0:00:00.047) 0:00:04.980 ****** 29946 1726882578.87128: entering _queue_task() for managed_node2/fail 29946 1726882578.87585: worker is 1 (out of 1 available) 29946 1726882578.87801: exiting _queue_task() for managed_node2/fail 29946 1726882578.87811: done queuing things up, now waiting for results queue to drain 29946 1726882578.87813: waiting for pending results... 29946 1726882578.88166: running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] 29946 1726882578.88361: in run() - task 12673a56-9f93-95e7-9dfb-0000000001cd 29946 1726882578.88365: variable 'ansible_search_path' from source: unknown 29946 1726882578.88368: variable 'ansible_search_path' from source: unknown 29946 1726882578.88424: calling self._execute() 29946 1726882578.88615: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.88622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.88689: variable 'omit' from source: magic vars 29946 1726882578.89397: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.89411: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.89682: variable 'type' from source: set_fact 29946 1726882578.89688: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 29946 1726882578.89691: when evaluation is False, skipping this task 29946 1726882578.89696: _execute() done 29946 1726882578.89698: dumping result to json 29946 1726882578.89733: done dumping result, returning 29946 1726882578.89770: done running TaskExecutor() for managed_node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12673a56-9f93-95e7-9dfb-0000000001cd] 29946 1726882578.89773: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001cd 29946 1726882578.90022: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001cd 29946 1726882578.90026: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 29946 1726882578.90070: no more pending 
results, returning what we have 29946 1726882578.90074: results queue empty 29946 1726882578.90075: checking for any_errors_fatal 29946 1726882578.90079: done checking for any_errors_fatal 29946 1726882578.90080: checking for max_fail_percentage 29946 1726882578.90082: done checking for max_fail_percentage 29946 1726882578.90082: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.90083: done checking to see if all hosts have failed 29946 1726882578.90084: getting the remaining hosts for this loop 29946 1726882578.90085: done getting the remaining hosts for this loop 29946 1726882578.90091: getting the next task for host managed_node2 29946 1726882578.90099: done getting next task for host managed_node2 29946 1726882578.90102: ^ task is: TASK: Include the task 'show_interfaces.yml' 29946 1726882578.90105: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882578.90109: getting variables 29946 1726882578.90110: in VariableManager get_vars() 29946 1726882578.90145: Calling all_inventory to load vars for managed_node2 29946 1726882578.90147: Calling groups_inventory to load vars for managed_node2 29946 1726882578.90149: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.90158: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.90161: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.90164: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.90549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.90966: done with get_vars() 29946 1726882578.90976: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Friday 20 September 2024 21:36:18 -0400 (0:00:00.041) 0:00:05.022 ****** 29946 1726882578.91273: entering _queue_task() for managed_node2/include_tasks 29946 1726882578.91634: worker is 1 (out of 1 available) 29946 1726882578.91646: exiting _queue_task() for managed_node2/include_tasks 29946 1726882578.91658: done queuing things up, now waiting for results queue to drain 29946 1726882578.91660: waiting for pending results... 
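The next step, manage_test_interface.yml:13, re-includes the shared show_interfaces.yml helper, again via include_tasks; a minimal sketch, assuming the file sits next to the including task file:

- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml   # relative path is an assumption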
29946 1726882578.92015: running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' 29946 1726882578.92220: in run() - task 12673a56-9f93-95e7-9dfb-0000000001ce 29946 1726882578.92233: variable 'ansible_search_path' from source: unknown 29946 1726882578.92238: variable 'ansible_search_path' from source: unknown 29946 1726882578.92269: calling self._execute() 29946 1726882578.92462: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.92468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.92477: variable 'omit' from source: magic vars 29946 1726882578.93600: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.93604: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.93606: _execute() done 29946 1726882578.93609: dumping result to json 29946 1726882578.93611: done dumping result, returning 29946 1726882578.93614: done running TaskExecutor() for managed_node2/TASK: Include the task 'show_interfaces.yml' [12673a56-9f93-95e7-9dfb-0000000001ce] 29946 1726882578.93617: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001ce 29946 1726882578.93688: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001ce 29946 1726882578.93694: WORKER PROCESS EXITING 29946 1726882578.93720: no more pending results, returning what we have 29946 1726882578.93725: in VariableManager get_vars() 29946 1726882578.93764: Calling all_inventory to load vars for managed_node2 29946 1726882578.93766: Calling groups_inventory to load vars for managed_node2 29946 1726882578.93768: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.93779: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.93781: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.93784: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.94243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.94689: done with get_vars() 29946 1726882578.94761: variable 'ansible_search_path' from source: unknown 29946 1726882578.94763: variable 'ansible_search_path' from source: unknown 29946 1726882578.94805: we have included files to process 29946 1726882578.94806: generating all_blocks data 29946 1726882578.94808: done generating all_blocks data 29946 1726882578.94814: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29946 1726882578.94815: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29946 1726882578.94817: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 29946 1726882578.95035: in VariableManager get_vars() 29946 1726882578.95054: done with get_vars() 29946 1726882578.95281: done processing included file 29946 1726882578.95283: iterating over new_blocks loaded from include file 29946 1726882578.95285: in VariableManager get_vars() 29946 1726882578.95415: done with get_vars() 29946 1726882578.95417: filtering new block on tags 29946 1726882578.95440: done filtering new block on tags 29946 1726882578.95443: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed_node2 29946 1726882578.95447: extending task lists for all hosts with included blocks 29946 1726882578.96385: done extending task lists 29946 1726882578.96387: done processing included files 29946 1726882578.96387: results queue empty 29946 1726882578.96390: checking for any_errors_fatal 29946 1726882578.96424: done checking for any_errors_fatal 29946 1726882578.96425: checking for max_fail_percentage 29946 1726882578.96426: done checking for max_fail_percentage 29946 1726882578.96426: checking to see if all hosts have failed and the running result is not ok 29946 1726882578.96427: done checking to see if all hosts have failed 29946 1726882578.96428: getting the remaining hosts for this loop 29946 1726882578.96429: done getting the remaining hosts for this loop 29946 1726882578.96432: getting the next task for host managed_node2 29946 1726882578.96436: done getting next task for host managed_node2 29946 1726882578.96438: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 29946 1726882578.96441: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882578.96443: getting variables 29946 1726882578.96444: in VariableManager get_vars() 29946 1726882578.96508: Calling all_inventory to load vars for managed_node2 29946 1726882578.96510: Calling groups_inventory to load vars for managed_node2 29946 1726882578.96512: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.96517: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.96519: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.96523: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.96872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.97309: done with get_vars() 29946 1726882578.97318: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Friday 20 September 2024 21:36:18 -0400 (0:00:00.062) 0:00:05.084 ****** 29946 1726882578.97508: entering _queue_task() for managed_node2/include_tasks 29946 1726882578.98015: worker is 1 (out of 1 available) 29946 1726882578.98141: exiting _queue_task() for managed_node2/include_tasks 29946 1726882578.98153: done queuing things up, now waiting for results queue to drain 29946 1726882578.98154: waiting for pending results... 
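show_interfaces.yml itself starts (line 3) by including get_current_interfaces.yml, which re-runs the interface gathering and set_fact tasks sketched earlier, before the debug at line 5. Its first task is presumably:

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml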
29946 1726882578.98320: running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' 29946 1726882578.98602: in run() - task 12673a56-9f93-95e7-9dfb-000000000275 29946 1726882578.98607: variable 'ansible_search_path' from source: unknown 29946 1726882578.98610: variable 'ansible_search_path' from source: unknown 29946 1726882578.98613: calling self._execute() 29946 1726882578.98616: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882578.98623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882578.98631: variable 'omit' from source: magic vars 29946 1726882578.99018: variable 'ansible_distribution_major_version' from source: facts 29946 1726882578.99029: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882578.99041: _execute() done 29946 1726882578.99044: dumping result to json 29946 1726882578.99046: done dumping result, returning 29946 1726882578.99053: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_current_interfaces.yml' [12673a56-9f93-95e7-9dfb-000000000275] 29946 1726882578.99056: sending task result for task 12673a56-9f93-95e7-9dfb-000000000275 29946 1726882578.99168: no more pending results, returning what we have 29946 1726882578.99173: in VariableManager get_vars() 29946 1726882578.99317: Calling all_inventory to load vars for managed_node2 29946 1726882578.99320: Calling groups_inventory to load vars for managed_node2 29946 1726882578.99322: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882578.99327: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000275 29946 1726882578.99330: WORKER PROCESS EXITING 29946 1726882578.99338: Calling all_plugins_play to load vars for managed_node2 29946 1726882578.99341: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882578.99343: Calling groups_plugins_play to load vars for managed_node2 29946 1726882578.99518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882578.99730: done with get_vars() 29946 1726882578.99737: variable 'ansible_search_path' from source: unknown 29946 1726882578.99739: variable 'ansible_search_path' from source: unknown 29946 1726882578.99796: we have included files to process 29946 1726882578.99797: generating all_blocks data 29946 1726882578.99799: done generating all_blocks data 29946 1726882578.99800: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29946 1726882578.99801: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29946 1726882578.99803: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 29946 1726882579.00197: done processing included file 29946 1726882579.00200: iterating over new_blocks loaded from include file 29946 1726882579.00201: in VariableManager get_vars() 29946 1726882579.00217: done with get_vars() 29946 1726882579.00219: filtering new block on tags 29946 1726882579.00236: done filtering new block on tags 29946 1726882579.00238: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed_node2 29946 1726882579.00242: extending task lists for all hosts with included blocks 29946 1726882579.00373: done extending task lists 29946 1726882579.00374: done processing included files 29946 1726882579.00375: results queue empty 29946 1726882579.00375: checking for any_errors_fatal 29946 1726882579.00378: done checking for any_errors_fatal 29946 1726882579.00379: checking for max_fail_percentage 29946 1726882579.00380: done checking for max_fail_percentage 29946 1726882579.00381: checking to see if all hosts have failed and the running result is not ok 29946 1726882579.00381: done checking to see if all hosts have failed 29946 1726882579.00382: getting the remaining hosts for this loop 29946 1726882579.00383: done getting the remaining hosts for this loop 29946 1726882579.00388: getting the next task for host managed_node2 29946 1726882579.00396: done getting next task for host managed_node2 29946 1726882579.00398: ^ task is: TASK: Gather current interface info 29946 1726882579.00401: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882579.00404: getting variables 29946 1726882579.00404: in VariableManager get_vars() 29946 1726882579.00416: Calling all_inventory to load vars for managed_node2 29946 1726882579.00419: Calling groups_inventory to load vars for managed_node2 29946 1726882579.00421: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882579.00425: Calling all_plugins_play to load vars for managed_node2 29946 1726882579.00428: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882579.00431: Calling groups_plugins_play to load vars for managed_node2 29946 1726882579.00617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882579.00835: done with get_vars() 29946 1726882579.00851: done getting variables 29946 1726882579.00891: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Friday 20 September 2024 21:36:19 -0400 (0:00:00.034) 0:00:05.118 ****** 29946 1726882579.00934: entering _queue_task() for managed_node2/command 29946 1726882579.01151: worker is 1 (out of 1 available) 29946 1726882579.01165: exiting _queue_task() for managed_node2/command 29946 1726882579.01174: done queuing things up, now waiting for results queue to drain 29946 1726882579.01176: waiting for pending results... 
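The task queued here (get_current_interfaces.yml:3) lists the kernel's view of network interfaces by running ls -1 under /sys/class/net through the command action plugin over the persistent SSH connection. A sketch of that task, assuming the register name _current_interfaces that appears later in the log; changed_when: false is inferred from the module reporting changed=true while the task result further below is ok with changed=false:

- name: Gather current interface info
  ansible.builtin.command:
    cmd: ls -1
    chdir: /sys/class/net   # prints one interface name per line, e.g. eth0, lo
  register: _current_interfaces
  changed_when: false       # inferred: a read-only probe should not report a change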
29946 1726882579.01365: running TaskExecutor() for managed_node2/TASK: Gather current interface info 29946 1726882579.01473: in run() - task 12673a56-9f93-95e7-9dfb-0000000002ac 29946 1726882579.01498: variable 'ansible_search_path' from source: unknown 29946 1726882579.01508: variable 'ansible_search_path' from source: unknown 29946 1726882579.01547: calling self._execute() 29946 1726882579.01638: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.01653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.01665: variable 'omit' from source: magic vars 29946 1726882579.02009: variable 'ansible_distribution_major_version' from source: facts 29946 1726882579.02030: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882579.02042: variable 'omit' from source: magic vars 29946 1726882579.02100: variable 'omit' from source: magic vars 29946 1726882579.02139: variable 'omit' from source: magic vars 29946 1726882579.02181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882579.02227: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882579.02255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882579.02284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.02324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.02344: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882579.02346: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.02348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.02416: Set connection var ansible_pipelining to False 29946 1726882579.02430: Set connection var ansible_shell_executable to /bin/sh 29946 1726882579.02433: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882579.02435: Set connection var ansible_timeout to 10 29946 1726882579.02438: Set connection var ansible_shell_type to sh 29946 1726882579.02440: Set connection var ansible_connection to ssh 29946 1726882579.02459: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.02462: variable 'ansible_connection' from source: unknown 29946 1726882579.02465: variable 'ansible_module_compression' from source: unknown 29946 1726882579.02467: variable 'ansible_shell_type' from source: unknown 29946 1726882579.02469: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.02471: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.02474: variable 'ansible_pipelining' from source: unknown 29946 1726882579.02476: variable 'ansible_timeout' from source: unknown 29946 1726882579.02480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.02612: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882579.02634: variable 'omit' from source: magic vars 29946 
1726882579.02637: starting attempt loop 29946 1726882579.02658: running the handler 29946 1726882579.02662: _low_level_execute_command(): starting 29946 1726882579.02664: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882579.03376: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882579.03380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.03383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882579.03385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.03456: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882579.03463: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882579.03466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.03542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.05211: stdout chunk (state=3): >>>/root <<< 29946 1726882579.05316: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.05348: stderr chunk (state=3): >>><<< 29946 1726882579.05351: stdout chunk (state=3): >>><<< 29946 1726882579.05364: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882579.05392: _low_level_execute_command(): starting 29946 1726882579.05398: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667 `" && echo ansible-tmp-1726882579.0536866-30182-143906113932667="` echo /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667 `" ) && sleep 0' 29946 1726882579.05783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882579.05791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882579.05816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.05829: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882579.05831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.05868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882579.05874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.05955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.07818: stdout chunk (state=3): >>>ansible-tmp-1726882579.0536866-30182-143906113932667=/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667 <<< 29946 1726882579.07928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.07957: stderr chunk (state=3): >>><<< 29946 1726882579.07960: stdout chunk (state=3): >>><<< 29946 1726882579.07979: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882579.0536866-30182-143906113932667=/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 29946 1726882579.08019: variable 'ansible_module_compression' from source: unknown 29946 1726882579.08083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882579.08110: variable 'ansible_facts' from source: unknown 29946 1726882579.08184: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py 29946 1726882579.08351: Sending initial data 29946 1726882579.08355: Sent initial data (156 bytes) 29946 1726882579.08885: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882579.08892: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.08946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882579.08949: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.09019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.10549: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882579.10555: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882579.10616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882579.10679: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpldcdu34f /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py <<< 29946 1726882579.10687: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py" <<< 29946 1726882579.10755: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpldcdu34f" to remote "/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py" <<< 29946 1726882579.11489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.11522: stderr chunk (state=3): >>><<< 29946 1726882579.11525: stdout chunk (state=3): >>><<< 29946 1726882579.11555: done transferring module to remote 29946 1726882579.11563: _low_level_execute_command(): starting 29946 1726882579.11567: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/ /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py && sleep 0' 29946 1726882579.12111: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882579.12115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.12120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.12165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882579.12172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882579.12188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.12278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.13999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.14024: stderr chunk (state=3): >>><<< 29946 1726882579.14028: stdout chunk (state=3): >>><<< 29946 1726882579.14040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882579.14043: _low_level_execute_command(): starting 29946 1726882579.14048: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/AnsiballZ_command.py && sleep 0' 29946 1726882579.14464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882579.14467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.14469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882579.14471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.14510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882579.14514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.14594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.30174: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:19.295443", "end": "2024-09-20 21:36:19.298805", "delta": "0:00:00.003362", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882579.31665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882579.31668: stdout chunk (state=3): >>><<< 29946 1726882579.31671: stderr chunk (state=3): >>><<< 29946 1726882579.31673: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-20 21:36:19.295443", "end": "2024-09-20 21:36:19.298805", "delta": "0:00:00.003362", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882579.31675: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882579.31678: _low_level_execute_command(): starting 29946 1726882579.31680: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882579.0536866-30182-143906113932667/ > /dev/null 2>&1 && sleep 0' 29946 1726882579.33174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882579.33204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882579.33310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.33348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882579.33366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882579.33400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.33524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.35410: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.35425: stdout chunk (state=3): >>><<< 29946 1726882579.35488: stderr chunk (state=3): >>><<< 29946 1726882579.35510: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882579.35628: handler run complete 29946 1726882579.35659: Evaluated conditional (False): False 29946 1726882579.35676: attempt loop complete, returning result 29946 1726882579.35683: _execute() done 29946 1726882579.35695: dumping result to json 29946 1726882579.35706: done dumping result, returning 29946 1726882579.35717: done running TaskExecutor() for managed_node2/TASK: Gather current interface info [12673a56-9f93-95e7-9dfb-0000000002ac] 29946 1726882579.35724: sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ac ok: [managed_node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003362", "end": "2024-09-20 21:36:19.298805", "rc": 0, "start": "2024-09-20 21:36:19.295443" } STDOUT: bonding_masters eth0 lo rpltstbr 29946 1726882579.35970: no more pending results, returning what we have 29946 1726882579.35975: results queue empty 29946 1726882579.35975: checking for any_errors_fatal 29946 1726882579.35977: done checking for any_errors_fatal 29946 1726882579.35978: checking for max_fail_percentage 29946 1726882579.35979: done checking for max_fail_percentage 29946 1726882579.35980: checking to see if all hosts have failed and the running result is not ok 29946 1726882579.35981: done checking to see if all hosts have failed 29946 1726882579.35982: getting the remaining hosts for this loop 29946 1726882579.35983: done getting the remaining hosts for this loop 29946 1726882579.35990: getting the next task for host managed_node2 29946 1726882579.35998: done getting next task for host managed_node2 29946 1726882579.36001: ^ task is: TASK: Set current_interfaces 29946 1726882579.36008: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882579.36013: getting variables 29946 1726882579.36015: in VariableManager get_vars() 29946 1726882579.36052: Calling all_inventory to load vars for managed_node2 29946 1726882579.36054: Calling groups_inventory to load vars for managed_node2 29946 1726882579.36057: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882579.36068: Calling all_plugins_play to load vars for managed_node2 29946 1726882579.36071: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882579.36074: Calling groups_plugins_play to load vars for managed_node2 29946 1726882579.36720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882579.37175: done with get_vars() 29946 1726882579.37189: done getting variables 29946 1726882579.37463: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ac 29946 1726882579.37467: WORKER PROCESS EXITING 29946 1726882579.37511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Friday 20 September 2024 21:36:19 -0400 (0:00:00.366) 0:00:05.484 ****** 29946 1726882579.37543: entering _queue_task() for managed_node2/set_fact 29946 1726882579.38106: worker is 1 (out of 1 available) 29946 1726882579.38118: exiting _queue_task() for managed_node2/set_fact 29946 1726882579.38129: done queuing things up, now waiting for results queue to drain 29946 1726882579.38131: waiting for pending results... 
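The set_fact task being queued converts the registered command output into the current_interfaces fact shown in the result below. A minimal sketch under the assumption that it simply takes stdout_lines of the registered result (the exact expression in the collection may differ):

- name: Set current_interfaces
  ansible.builtin.set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"

set_fact runs entirely on the controller, which is why no SSH traffic appears for this task in the log.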
29946 1726882579.38682: running TaskExecutor() for managed_node2/TASK: Set current_interfaces 29946 1726882579.38839: in run() - task 12673a56-9f93-95e7-9dfb-0000000002ad 29946 1726882579.38895: variable 'ansible_search_path' from source: unknown 29946 1726882579.38905: variable 'ansible_search_path' from source: unknown 29946 1726882579.38944: calling self._execute() 29946 1726882579.39037: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.39050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.39063: variable 'omit' from source: magic vars 29946 1726882579.39435: variable 'ansible_distribution_major_version' from source: facts 29946 1726882579.39457: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882579.39468: variable 'omit' from source: magic vars 29946 1726882579.39526: variable 'omit' from source: magic vars 29946 1726882579.39638: variable '_current_interfaces' from source: set_fact 29946 1726882579.39711: variable 'omit' from source: magic vars 29946 1726882579.39755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882579.39804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882579.39827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882579.39847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.39860: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.39901: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882579.39910: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.39918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.40029: Set connection var ansible_pipelining to False 29946 1726882579.40039: Set connection var ansible_shell_executable to /bin/sh 29946 1726882579.40048: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882579.40056: Set connection var ansible_timeout to 10 29946 1726882579.40066: Set connection var ansible_shell_type to sh 29946 1726882579.40072: Set connection var ansible_connection to ssh 29946 1726882579.40106: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.40116: variable 'ansible_connection' from source: unknown 29946 1726882579.40123: variable 'ansible_module_compression' from source: unknown 29946 1726882579.40130: variable 'ansible_shell_type' from source: unknown 29946 1726882579.40137: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.40144: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.40151: variable 'ansible_pipelining' from source: unknown 29946 1726882579.40158: variable 'ansible_timeout' from source: unknown 29946 1726882579.40166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.40309: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 29946 1726882579.40399: variable 'omit' from source: magic vars 29946 1726882579.40402: starting attempt loop 29946 1726882579.40404: running the handler 29946 1726882579.40407: handler run complete 29946 1726882579.40409: attempt loop complete, returning result 29946 1726882579.40410: _execute() done 29946 1726882579.40412: dumping result to json 29946 1726882579.40415: done dumping result, returning 29946 1726882579.40417: done running TaskExecutor() for managed_node2/TASK: Set current_interfaces [12673a56-9f93-95e7-9dfb-0000000002ad] 29946 1726882579.40419: sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ad ok: [managed_node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 29946 1726882579.40601: no more pending results, returning what we have 29946 1726882579.40606: results queue empty 29946 1726882579.40607: checking for any_errors_fatal 29946 1726882579.40619: done checking for any_errors_fatal 29946 1726882579.40620: checking for max_fail_percentage 29946 1726882579.40622: done checking for max_fail_percentage 29946 1726882579.40622: checking to see if all hosts have failed and the running result is not ok 29946 1726882579.40623: done checking to see if all hosts have failed 29946 1726882579.40624: getting the remaining hosts for this loop 29946 1726882579.40625: done getting the remaining hosts for this loop 29946 1726882579.40629: getting the next task for host managed_node2 29946 1726882579.40637: done getting next task for host managed_node2 29946 1726882579.40640: ^ task is: TASK: Show current_interfaces 29946 1726882579.40645: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882579.40649: getting variables 29946 1726882579.40651: in VariableManager get_vars() 29946 1726882579.40689: Calling all_inventory to load vars for managed_node2 29946 1726882579.40692: Calling groups_inventory to load vars for managed_node2 29946 1726882579.41073: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882579.41079: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ad 29946 1726882579.41083: WORKER PROCESS EXITING 29946 1726882579.41096: Calling all_plugins_play to load vars for managed_node2 29946 1726882579.41099: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882579.41103: Calling groups_plugins_play to load vars for managed_node2 29946 1726882579.41282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882579.42583: done with get_vars() 29946 1726882579.42601: done getting variables 29946 1726882579.42657: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Friday 20 September 2024 21:36:19 -0400 (0:00:00.051) 0:00:05.536 ****** 29946 1726882579.42689: entering _queue_task() for managed_node2/debug 29946 1726882579.43535: worker is 1 (out of 1 available) 29946 1726882579.43547: exiting _queue_task() for managed_node2/debug 29946 1726882579.43558: done queuing things up, now waiting for results queue to drain 29946 1726882579.43559: waiting for pending results... 
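The debug task queued here (show_interfaces.yml:5) prints the fact that was just set; its output is the MSG line in the result below. A plausible form of that task, with the message template assumed from the printed output:

- name: Show current_interfaces
  ansible.builtin.debug:
    msg: "current_interfaces: {{ current_interfaces }}"

Like set_fact, debug executes on the controller, so no remote command is issued for it.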
29946 1726882579.44358: running TaskExecutor() for managed_node2/TASK: Show current_interfaces 29946 1726882579.44444: in run() - task 12673a56-9f93-95e7-9dfb-000000000276 29946 1726882579.44799: variable 'ansible_search_path' from source: unknown 29946 1726882579.44803: variable 'ansible_search_path' from source: unknown 29946 1726882579.44805: calling self._execute() 29946 1726882579.45194: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.45200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.45205: variable 'omit' from source: magic vars 29946 1726882579.45533: variable 'ansible_distribution_major_version' from source: facts 29946 1726882579.45756: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882579.45768: variable 'omit' from source: magic vars 29946 1726882579.45819: variable 'omit' from source: magic vars 29946 1726882579.45919: variable 'current_interfaces' from source: set_fact 29946 1726882579.46218: variable 'omit' from source: magic vars 29946 1726882579.46221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882579.46224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882579.46316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882579.46344: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.46361: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.46397: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882579.46441: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.46450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.46683: Set connection var ansible_pipelining to False 29946 1726882579.46697: Set connection var ansible_shell_executable to /bin/sh 29946 1726882579.46709: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882579.46745: Set connection var ansible_timeout to 10 29946 1726882579.46869: Set connection var ansible_shell_type to sh 29946 1726882579.47036: Set connection var ansible_connection to ssh 29946 1726882579.47039: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.47042: variable 'ansible_connection' from source: unknown 29946 1726882579.47044: variable 'ansible_module_compression' from source: unknown 29946 1726882579.47046: variable 'ansible_shell_type' from source: unknown 29946 1726882579.47047: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.47049: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.47051: variable 'ansible_pipelining' from source: unknown 29946 1726882579.47053: variable 'ansible_timeout' from source: unknown 29946 1726882579.47055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.47370: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
29946 1726882579.47525: variable 'omit' from source: magic vars 29946 1726882579.47528: starting attempt loop 29946 1726882579.47531: running the handler 29946 1726882579.47534: handler run complete 29946 1726882579.47536: attempt loop complete, returning result 29946 1726882579.47538: _execute() done 29946 1726882579.47539: dumping result to json 29946 1726882579.47541: done dumping result, returning 29946 1726882579.47544: done running TaskExecutor() for managed_node2/TASK: Show current_interfaces [12673a56-9f93-95e7-9dfb-000000000276] 29946 1726882579.47547: sending task result for task 12673a56-9f93-95e7-9dfb-000000000276 29946 1726882579.47808: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000276 29946 1726882579.47812: WORKER PROCESS EXITING ok: [managed_node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 29946 1726882579.47894: no more pending results, returning what we have 29946 1726882579.47898: results queue empty 29946 1726882579.47900: checking for any_errors_fatal 29946 1726882579.47907: done checking for any_errors_fatal 29946 1726882579.47908: checking for max_fail_percentage 29946 1726882579.47910: done checking for max_fail_percentage 29946 1726882579.47910: checking to see if all hosts have failed and the running result is not ok 29946 1726882579.47911: done checking to see if all hosts have failed 29946 1726882579.47912: getting the remaining hosts for this loop 29946 1726882579.47913: done getting the remaining hosts for this loop 29946 1726882579.47917: getting the next task for host managed_node2 29946 1726882579.47926: done getting next task for host managed_node2 29946 1726882579.47929: ^ task is: TASK: Install iproute 29946 1726882579.47932: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882579.47936: getting variables 29946 1726882579.47938: in VariableManager get_vars() 29946 1726882579.47974: Calling all_inventory to load vars for managed_node2 29946 1726882579.47976: Calling groups_inventory to load vars for managed_node2 29946 1726882579.47978: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882579.47989: Calling all_plugins_play to load vars for managed_node2 29946 1726882579.47992: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882579.47997: Calling groups_plugins_play to load vars for managed_node2 29946 1726882579.48822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882579.49412: done with get_vars() 29946 1726882579.49424: done getting variables 29946 1726882579.49483: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Friday 20 September 2024 21:36:19 -0400 (0:00:00.068) 0:00:05.604 ****** 29946 1726882579.49518: entering _queue_task() for managed_node2/package 29946 1726882579.50212: worker is 1 (out of 1 available) 29946 1726882579.50222: exiting _queue_task() for managed_node2/package 29946 1726882579.50233: done queuing things up, now waiting for results queue to drain 29946 1726882579.50234: waiting for pending results... 
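The next task, Install iproute (manage_test_interface.yml:16), goes through the generic package action, which the log below resolves to the dnf backend (ANSIBALLZ: ... ansible.legacy.dnf). A simplified sketch; the real task also consults the __network_is_ostree fact visible in the log, which is omitted here:

- name: Install iproute
  ansible.builtin.package:
    name: iproute
    state: present

Using package rather than dnf directly lets the same test task run on distributions with other package managers.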
29946 1726882579.50661: running TaskExecutor() for managed_node2/TASK: Install iproute 29946 1726882579.50774: in run() - task 12673a56-9f93-95e7-9dfb-0000000001cf 29946 1726882579.50882: variable 'ansible_search_path' from source: unknown 29946 1726882579.50886: variable 'ansible_search_path' from source: unknown 29946 1726882579.50889: calling self._execute() 29946 1726882579.51500: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.51507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.51509: variable 'omit' from source: magic vars 29946 1726882579.51823: variable 'ansible_distribution_major_version' from source: facts 29946 1726882579.51973: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882579.51983: variable 'omit' from source: magic vars 29946 1726882579.52027: variable 'omit' from source: magic vars 29946 1726882579.52392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882579.57718: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882579.57722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882579.57725: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882579.58498: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882579.58502: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882579.58505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882579.58508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882579.58510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882579.59098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882579.59101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882579.59104: variable '__network_is_ostree' from source: set_fact 29946 1726882579.59106: variable 'omit' from source: magic vars 29946 1726882579.59108: variable 'omit' from source: magic vars 29946 1726882579.59320: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882579.59351: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882579.59698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882579.59702: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 29946 1726882579.59705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882579.59707: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882579.59709: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.59711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.59976: Set connection var ansible_pipelining to False 29946 1726882579.59987: Set connection var ansible_shell_executable to /bin/sh 29946 1726882579.60002: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882579.60014: Set connection var ansible_timeout to 10 29946 1726882579.60026: Set connection var ansible_shell_type to sh 29946 1726882579.60203: Set connection var ansible_connection to ssh 29946 1726882579.60233: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.60251: variable 'ansible_connection' from source: unknown 29946 1726882579.60259: variable 'ansible_module_compression' from source: unknown 29946 1726882579.60266: variable 'ansible_shell_type' from source: unknown 29946 1726882579.60698: variable 'ansible_shell_executable' from source: unknown 29946 1726882579.60701: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882579.60704: variable 'ansible_pipelining' from source: unknown 29946 1726882579.60706: variable 'ansible_timeout' from source: unknown 29946 1726882579.60708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882579.60711: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882579.60714: variable 'omit' from source: magic vars 29946 1726882579.60715: starting attempt loop 29946 1726882579.60718: running the handler 29946 1726882579.60720: variable 'ansible_facts' from source: unknown 29946 1726882579.60726: variable 'ansible_facts' from source: unknown 29946 1726882579.60729: _low_level_execute_command(): starting 29946 1726882579.60731: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882579.62611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.62671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 
1726882579.63108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882579.64717: stdout chunk (state=3): >>>/root <<< 29946 1726882579.64805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.64842: stderr chunk (state=3): >>><<< 29946 1726882579.64852: stdout chunk (state=3): >>><<< 29946 1726882579.64880: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882579.64908: _low_level_execute_command(): starting 29946 1726882579.64918: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719 `" && echo ansible-tmp-1726882579.6489596-30215-181725945213719="` echo /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719 `" ) && sleep 0' 29946 1726882579.66380: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882579.66440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882579.66465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882579.66496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882579.66607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
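The entries above show the controller resolving connection vars for managed_node2 and creating the per-task remote temp directory before the module payload is pushed. For orientation, here is a minimal reconstruction of the "Install iproute" task being executed in this stretch of the log; it is a sketch, not the test's actual source. The module arguments (name=iproute, state=present), the retry condition on __install_status, and the distribution check are taken from the log entries above and below; the file layout and any retries/delay tuning are assumptions.

# Hedged reconstruction of the task driving this section of the log.
# register/until names and the when condition come from the log itself;
# retries/delay values, if any, are not visible in this excerpt.
- name: Install iproute
  package:
    name: iproute
    state: present
  register: __install_status
  until: __install_status is success
  when: ansible_distribution_major_version != '6'

On this host the generic package action resolves to ansible.legacy.dnf, which is why the payload transferred just below is AnsiballZ_dnf.py.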
29946 1726882579.68506: stdout chunk (state=3): >>>ansible-tmp-1726882579.6489596-30215-181725945213719=/root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719 <<< 29946 1726882579.68615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882579.68706: stderr chunk (state=3): >>><<< 29946 1726882579.68782: stdout chunk (state=3): >>><<< 29946 1726882579.68902: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882579.6489596-30215-181725945213719=/root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882579.68906: variable 'ansible_module_compression' from source: unknown 29946 1726882579.69022: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 29946 1726882579.69029: ANSIBALLZ: Acquiring lock 29946 1726882579.69034: ANSIBALLZ: Lock acquired: 140626579263984 29946 1726882579.69040: ANSIBALLZ: Creating module 29946 1726882580.01428: ANSIBALLZ: Writing module into payload 29946 1726882580.01601: ANSIBALLZ: Writing module 29946 1726882580.01620: ANSIBALLZ: Renaming module 29946 1726882580.01631: ANSIBALLZ: Done creating module 29946 1726882580.01651: variable 'ansible_facts' from source: unknown 29946 1726882580.01750: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py 29946 1726882580.01915: Sending initial data 29946 1726882580.01919: Sent initial data (152 bytes) 29946 1726882580.02449: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882580.02465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882580.02521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.02616: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.02644: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.02749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.04542: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882580.04546: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882580.04625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpnqtwglhp /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py <<< 29946 1726882580.04629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py" <<< 29946 1726882580.04729: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpnqtwglhp" to remote "/root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py" <<< 29946 1726882580.06043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.06089: stderr chunk (state=3): >>><<< 29946 1726882580.06096: stdout chunk (state=3): >>><<< 29946 1726882580.06156: done transferring module to remote 29946 1726882580.06167: _low_level_execute_command(): starting 29946 1726882580.06171: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/ /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py && sleep 0' 29946 1726882580.06791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882580.06901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882580.06904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882580.06964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.07037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.07048: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.07064: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.07154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.08929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.08932: stdout chunk (state=3): >>><<< 29946 1726882580.08939: stderr chunk (state=3): >>><<< 29946 1726882580.09099: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882580.09102: _low_level_execute_command(): starting 29946 1726882580.09105: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/AnsiballZ_dnf.py && sleep 0' 29946 1726882580.09601: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882580.09618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882580.09634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882580.09651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882580.09713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.09763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.09779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.09809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.09907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.49915: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 29946 1726882580.54004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882580.54009: stdout chunk (state=3): >>><<< 29946 1726882580.54011: stderr chunk (state=3): >>><<< 29946 1726882580.54015: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882580.54017: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882580.54024: _low_level_execute_command(): starting 29946 1726882580.54026: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882579.6489596-30215-181725945213719/ > /dev/null 2>&1 && sleep 0' 29946 1726882580.54732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.54763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.54857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.56900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.56904: stdout chunk (state=3): >>><<< 29946 1726882580.56906: stderr chunk (state=3): >>><<< 29946 1726882580.56909: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882580.56911: handler run complete 29946 1726882580.56974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882580.57190: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882580.57256: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882580.57303: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882580.57342: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882580.57435: variable '__install_status' from source: unknown 29946 1726882580.57459: Evaluated conditional (__install_status is success): True 29946 1726882580.57496: attempt loop complete, returning result 29946 1726882580.57504: _execute() done 29946 1726882580.57511: dumping result to json 29946 1726882580.57521: done dumping result, returning 29946 1726882580.57581: done running TaskExecutor() for managed_node2/TASK: Install iproute [12673a56-9f93-95e7-9dfb-0000000001cf] 29946 1726882580.57584: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001cf ok: [managed_node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 29946 1726882580.57889: no more pending results, returning what we have 29946 1726882580.57973: results queue empty 29946 1726882580.58010: checking for any_errors_fatal 29946 1726882580.58017: done checking for any_errors_fatal 29946 1726882580.58018: checking for max_fail_percentage 29946 1726882580.58020: done checking for max_fail_percentage 29946 1726882580.58021: checking to see if all hosts have failed and the running result is not ok 29946 1726882580.58021: done checking to see if all hosts have failed 29946 1726882580.58022: getting the remaining hosts for this loop 29946 1726882580.58024: done getting the remaining hosts for this loop 29946 1726882580.58027: getting the next task for host managed_node2 29946 1726882580.58034: done getting next task for host managed_node2 29946 1726882580.58037: ^ task is: TASK: Create veth interface {{ interface }} 29946 1726882580.58040: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882580.58044: getting variables 29946 1726882580.58046: in VariableManager get_vars() 29946 1726882580.58083: Calling all_inventory to load vars for managed_node2 29946 1726882580.58089: Calling groups_inventory to load vars for managed_node2 29946 1726882580.58092: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882580.58238: Calling all_plugins_play to load vars for managed_node2 29946 1726882580.58242: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882580.58301: Calling groups_plugins_play to load vars for managed_node2 29946 1726882580.58759: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001cf 29946 1726882580.58762: WORKER PROCESS EXITING 29946 1726882580.58785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882580.59012: done with get_vars() 29946 1726882580.59023: done getting variables 29946 1726882580.59077: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882580.59191: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Friday 20 September 2024 21:36:20 -0400 (0:00:01.097) 0:00:06.701 ****** 29946 1726882580.59225: entering _queue_task() for managed_node2/command 29946 1726882580.59476: worker is 1 (out of 1 available) 29946 1726882580.59487: exiting _queue_task() for managed_node2/command 29946 1726882580.59499: done queuing things up, now waiting for results queue to drain 29946 1726882580.59500: waiting for pending results... 
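The banner above marks the start of the looped command task at manage_test_interface.yml:27. A hedged reconstruction of that task follows, pieced together from the items lookup, the evaluated conditional, and the first loop item reported further down; only that first item appears in this excerpt, so the remaining items are omitted rather than guessed, and changed_when: false is inferred from the per-item result printing changed: false even though the command runs.

# Hedged reconstruction; the peer name "peer{{ interface }}" matches the
# peerethtest0 seen below for interface=ethtest0, and the when expression is
# copied verbatim from the evaluated conditional in the log.
- name: Create veth interface {{ interface }}
  command: "{{ item }}"
  with_items:
    - ip link add {{ interface }} type veth peer name peer{{ interface }}
  changed_when: false
  when: type == 'veth' and state == 'present' and interface not in current_interfaces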
29946 1726882580.59733: running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 29946 1726882580.59830: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d0 29946 1726882580.59848: variable 'ansible_search_path' from source: unknown 29946 1726882580.59855: variable 'ansible_search_path' from source: unknown 29946 1726882580.60101: variable 'interface' from source: set_fact 29946 1726882580.60188: variable 'interface' from source: set_fact 29946 1726882580.60267: variable 'interface' from source: set_fact 29946 1726882580.60405: Loaded config def from plugin (lookup/items) 29946 1726882580.60420: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 29946 1726882580.60445: variable 'omit' from source: magic vars 29946 1726882580.60566: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882580.60584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882580.60601: variable 'omit' from source: magic vars 29946 1726882580.61145: variable 'ansible_distribution_major_version' from source: facts 29946 1726882580.61158: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882580.61350: variable 'type' from source: set_fact 29946 1726882580.61359: variable 'state' from source: include params 29946 1726882580.61367: variable 'interface' from source: set_fact 29946 1726882580.61374: variable 'current_interfaces' from source: set_fact 29946 1726882580.61386: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 29946 1726882580.61598: variable 'omit' from source: magic vars 29946 1726882580.61602: variable 'omit' from source: magic vars 29946 1726882580.61604: variable 'item' from source: unknown 29946 1726882580.61606: variable 'item' from source: unknown 29946 1726882580.61608: variable 'omit' from source: magic vars 29946 1726882580.61610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882580.61621: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882580.61641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882580.61661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882580.61675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882580.61708: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882580.61716: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882580.61728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882580.61829: Set connection var ansible_pipelining to False 29946 1726882580.61842: Set connection var ansible_shell_executable to /bin/sh 29946 1726882580.61851: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882580.61860: Set connection var ansible_timeout to 10 29946 1726882580.61871: Set connection var ansible_shell_type to sh 29946 1726882580.61878: Set connection var ansible_connection to ssh 29946 1726882580.61902: variable 'ansible_shell_executable' from source: unknown 29946 1726882580.61909: variable 'ansible_connection' from source: unknown 29946 1726882580.61916: variable 
'ansible_module_compression' from source: unknown 29946 1726882580.61922: variable 'ansible_shell_type' from source: unknown 29946 1726882580.61928: variable 'ansible_shell_executable' from source: unknown 29946 1726882580.61935: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882580.61946: variable 'ansible_pipelining' from source: unknown 29946 1726882580.61953: variable 'ansible_timeout' from source: unknown 29946 1726882580.61960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882580.62087: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882580.62104: variable 'omit' from source: magic vars 29946 1726882580.62116: starting attempt loop 29946 1726882580.62162: running the handler 29946 1726882580.62165: _low_level_execute_command(): starting 29946 1726882580.62168: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882580.62847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882580.62864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882580.62879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882580.62941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.62998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.63018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.63046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.63149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.64780: stdout chunk (state=3): >>>/root <<< 29946 1726882580.64931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.64934: stdout chunk (state=3): >>><<< 29946 1726882580.64937: stderr chunk (state=3): >>><<< 29946 1726882580.64959: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882580.65058: _low_level_execute_command(): starting 29946 1726882580.65061: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969 `" && echo ansible-tmp-1726882580.6497216-30282-260733233829969="` echo /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969 `" ) && sleep 0' 29946 1726882580.65605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882580.65714: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.65727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.65746: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.65837: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.67741: stdout chunk (state=3): >>>ansible-tmp-1726882580.6497216-30282-260733233829969=/root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969 <<< 29946 1726882580.67904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.67907: stdout chunk (state=3): >>><<< 29946 1726882580.67910: stderr chunk (state=3): >>><<< 29946 1726882580.67927: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882580.6497216-30282-260733233829969=/root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882580.68000: variable 'ansible_module_compression' from source: unknown 29946 1726882580.68023: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882580.68065: variable 'ansible_facts' from source: unknown 29946 1726882580.68162: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py 29946 1726882580.68398: Sending initial data 29946 1726882580.68401: Sent initial data (156 bytes) 29946 1726882580.68999: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.69016: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.69061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.69088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.69118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.69204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.70729: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29946 1726882580.70744: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 
debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882580.70804: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882580.70866: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpw610btyh /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py <<< 29946 1726882580.70869: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py" <<< 29946 1726882580.70928: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpw610btyh" to remote "/root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py" <<< 29946 1726882580.71545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.71646: stderr chunk (state=3): >>><<< 29946 1726882580.71650: stdout chunk (state=3): >>><<< 29946 1726882580.71652: done transferring module to remote 29946 1726882580.71654: _low_level_execute_command(): starting 29946 1726882580.71656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/ /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py && sleep 0' 29946 1726882580.72231: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882580.72237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882580.72240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882580.72242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882580.72245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882580.72252: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882580.72317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.72354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.72371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.72377: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.72476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.74209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.74228: stderr chunk (state=3): >>><<< 29946 1726882580.74231: 
stdout chunk (state=3): >>><<< 29946 1726882580.74242: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882580.74245: _low_level_execute_command(): starting 29946 1726882580.74251: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/AnsiballZ_command.py && sleep 0' 29946 1726882580.74775: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882580.74779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.74782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.74784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882580.74821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.74896: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.90437: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:36:20.897176", "end": "2024-09-20 21:36:20.903100", "delta": "0:00:00.005924", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882580.92854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882580.92880: stderr chunk (state=3): >>><<< 29946 1726882580.92883: stdout chunk (state=3): >>><<< 29946 1726882580.92903: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-20 21:36:20.897176", "end": "2024-09-20 21:36:20.903100", "delta": "0:00:00.005924", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882580.92935: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882580.92943: _low_level_execute_command(): starting 29946 1726882580.92946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882580.6497216-30282-260733233829969/ > /dev/null 2>&1 && sleep 0' 29946 1726882580.93362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882580.93398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882580.93402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.93404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882580.93406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882580.93449: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882580.93452: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882580.93720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882580.98443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882580.98463: stderr chunk (state=3): >>><<< 29946 1726882580.98466: stdout chunk (state=3): >>><<< 29946 1726882580.98479: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882580.98485: handler run complete 29946 1726882580.98508: Evaluated conditional (False): False 29946 1726882580.98516: attempt loop complete, returning result 29946 1726882580.98535: variable 'item' from source: unknown 29946 1726882580.98598: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.005924", "end": "2024-09-20 21:36:20.903100", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-20 21:36:20.897176" } 29946 1726882580.98756: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882580.98759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882580.98762: variable 'omit' from source: magic vars 29946 1726882580.99107: variable 'ansible_distribution_major_version' from source: facts 29946 1726882580.99110: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882580.99113: variable 'type' from source: set_fact 29946 1726882580.99115: variable 'state' from source: include params 29946 1726882580.99117: variable 'interface' from source: set_fact 29946 1726882580.99120: variable 'current_interfaces' from source: set_fact 29946 1726882580.99122: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 29946 1726882580.99124: variable 'omit' from source: magic vars 29946 1726882580.99126: variable 'omit' from source: magic vars 29946 1726882580.99300: variable 'item' from source: unknown 29946 1726882580.99303: variable 'item' from source: unknown 29946 1726882580.99305: variable 'omit' from source: magic vars 29946 1726882580.99323: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882580.99330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882580.99337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882580.99351: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882580.99354: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882580.99357: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882580.99439: Set connection var ansible_pipelining to False 29946 1726882580.99446: Set connection var ansible_shell_executable to /bin/sh 29946 1726882580.99449: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882580.99457: Set connection var ansible_timeout to 10 29946 1726882580.99459: Set connection var ansible_shell_type to 
sh 29946 1726882580.99461: Set connection var ansible_connection to ssh 29946 1726882580.99542: variable 'ansible_shell_executable' from source: unknown 29946 1726882580.99545: variable 'ansible_connection' from source: unknown 29946 1726882580.99547: variable 'ansible_module_compression' from source: unknown 29946 1726882580.99550: variable 'ansible_shell_type' from source: unknown 29946 1726882580.99552: variable 'ansible_shell_executable' from source: unknown 29946 1726882580.99554: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882580.99556: variable 'ansible_pipelining' from source: unknown 29946 1726882580.99558: variable 'ansible_timeout' from source: unknown 29946 1726882580.99561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882580.99587: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882580.99600: variable 'omit' from source: magic vars 29946 1726882580.99603: starting attempt loop 29946 1726882580.99605: running the handler 29946 1726882580.99616: _low_level_execute_command(): starting 29946 1726882580.99618: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882581.00199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.00207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.00218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.00232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.00245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882581.00251: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882581.00261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.00274: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882581.00282: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882581.00292: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29946 1726882581.00305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.00319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.00330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.00337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882581.00344: stderr chunk (state=3): >>>debug2: match found <<< 29946 1726882581.00353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.00422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.00470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.00473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 29946 1726882581.00534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.02129: stdout chunk (state=3): >>>/root <<< 29946 1726882581.02223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.02277: stderr chunk (state=3): >>><<< 29946 1726882581.02291: stdout chunk (state=3): >>><<< 29946 1726882581.02391: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.02397: _low_level_execute_command(): starting 29946 1726882581.02400: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124 `" && echo ansible-tmp-1726882581.0231574-30282-267770209106124="` echo /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124 `" ) && sleep 0' 29946 1726882581.02957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.02970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.03006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.03021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882581.03058: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.03127: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.03160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 
1726882581.03243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.05132: stdout chunk (state=3): >>>ansible-tmp-1726882581.0231574-30282-267770209106124=/root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124 <<< 29946 1726882581.05290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.05296: stdout chunk (state=3): >>><<< 29946 1726882581.05298: stderr chunk (state=3): >>><<< 29946 1726882581.05499: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882581.0231574-30282-267770209106124=/root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.05502: variable 'ansible_module_compression' from source: unknown 29946 1726882581.05504: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882581.05506: variable 'ansible_facts' from source: unknown 29946 1726882581.05508: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py 29946 1726882581.05714: Sending initial data 29946 1726882581.05723: Sent initial data (156 bytes) 29946 1726882581.06892: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.06946: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 29946 1726882581.07117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.08590: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29946 1726882581.08620: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882581.08679: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882581.08745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpo4b8i0pd /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py <<< 29946 1726882581.08770: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py" <<< 29946 1726882581.08837: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpo4b8i0pd" to remote "/root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py" <<< 29946 1726882581.10113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.10117: stdout chunk (state=3): >>><<< 29946 1726882581.10119: stderr chunk (state=3): >>><<< 29946 1726882581.10121: done transferring module to remote 29946 1726882581.10123: _low_level_execute_command(): starting 29946 1726882581.10125: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/ /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py && sleep 0' 29946 1726882581.11200: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.11496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.11520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.11621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.13399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.13439: stderr chunk (state=3): >>><<< 29946 1726882581.13609: stdout chunk (state=3): >>><<< 29946 1726882581.13627: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.13631: _low_level_execute_command(): starting 29946 1726882581.13636: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/AnsiballZ_command.py && sleep 0' 29946 1726882581.14764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.14780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.14797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.14883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.15039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.15123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 
1726882581.30643: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:36:21.301325", "end": "2024-09-20 21:36:21.305266", "delta": "0:00:00.003941", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882581.32198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882581.32203: stdout chunk (state=3): >>><<< 29946 1726882581.32206: stderr chunk (state=3): >>><<< 29946 1726882581.32302: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-20 21:36:21.301325", "end": "2024-09-20 21:36:21.305266", "delta": "0:00:00.003941", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
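The trace above is one complete pass of the module-execution lifecycle for a single loop item: _low_level_execute_command() resolves the remote home directory (echo ~), creates a private temp directory under ~/.ansible/tmp, transfers AnsiballZ_command.py over the multiplexed sftp channel, marks it executable, runs it with /usr/bin/python3.12, and reads the module's JSON result from stdout before the temp directory is removed. The sketch below only mirrors that shape with plain subprocess calls so the steps are easier to follow; it is not the Ansible implementation, it runs the steps locally instead of over ssh, and the temp-directory name is a made-up placeholder.

    import json
    import subprocess
    import time

    def run_over_sh(cmd: str) -> subprocess.CompletedProcess:
        # Every low-level step in the log is wrapped as: /bin/sh -c '<cmd> && sleep 0'
        return subprocess.run(["/bin/sh", "-c", cmd + " && sleep 0"],
                              capture_output=True, text=True)

    home = run_over_sh("echo ~").stdout.strip()                         # '/root' in this run
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-example"   # hypothetical name
    run_over_sh(f'umask 77 && mkdir -p "{tmpdir}"')                     # private temp dir
    # ... AnsiballZ_command.py would be copied into tmpdir here (sftp in the real flow) ...
    run_over_sh(f'chmod u+x "{tmpdir}" "{tmpdir}/AnsiballZ_command.py"')
    proc = run_over_sh(f'/usr/bin/python3.12 "{tmpdir}/AnsiballZ_command.py"')
    print(json.loads(proc.stdout or "{}"))                              # module JSON result
    run_over_sh(f'rm -f -r "{tmpdir}" > /dev/null 2>&1')                # cleanup, as in the log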
29946 1726882581.32312: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882581.32315: _low_level_execute_command(): starting 29946 1726882581.32317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882581.0231574-30282-267770209106124/ > /dev/null 2>&1 && sleep 0' 29946 1726882581.33910: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.33985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.33988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.33990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.33992: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882581.34087: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.34222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.34229: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.34320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.36473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.36477: stdout chunk (state=3): >>><<< 29946 1726882581.36484: stderr chunk (state=3): >>><<< 29946 1726882581.36504: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.36510: handler run complete 29946 1726882581.36530: Evaluated conditional (False): False 29946 1726882581.36698: attempt loop complete, returning result 29946 1726882581.36701: variable 'item' from source: unknown 29946 1726882581.36753: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set peerethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest0", "up" ], "delta": "0:00:00.003941", "end": "2024-09-20 21:36:21.305266", "item": "ip link set peerethtest0 up", "rc": 0, "start": "2024-09-20 21:36:21.301325" } 29946 1726882581.37198: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882581.37201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882581.37203: variable 'omit' from source: magic vars 29946 1726882581.37600: variable 'ansible_distribution_major_version' from source: facts 29946 1726882581.37604: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882581.37676: variable 'type' from source: set_fact 29946 1726882581.37690: variable 'state' from source: include params 29946 1726882581.37705: variable 'interface' from source: set_fact 29946 1726882581.37717: variable 'current_interfaces' from source: set_fact 29946 1726882581.37733: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 29946 1726882581.37746: variable 'omit' from source: magic vars 29946 1726882581.37766: variable 'omit' from source: magic vars 29946 1726882581.37813: variable 'item' from source: unknown 29946 1726882581.37897: variable 'item' from source: unknown 29946 1726882581.37916: variable 'omit' from source: magic vars 29946 1726882581.37950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882581.37969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882581.37982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882581.38041: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882581.38044: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882581.38047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882581.38120: Set connection var ansible_pipelining to False 29946 1726882581.38132: Set connection var ansible_shell_executable to /bin/sh 29946 1726882581.38145: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882581.38160: Set connection var ansible_timeout to 10 29946 1726882581.38184: Set connection var ansible_shell_type to sh 29946 1726882581.38262: Set connection var 
ansible_connection to ssh 29946 1726882581.38266: variable 'ansible_shell_executable' from source: unknown 29946 1726882581.38268: variable 'ansible_connection' from source: unknown 29946 1726882581.38270: variable 'ansible_module_compression' from source: unknown 29946 1726882581.38279: variable 'ansible_shell_type' from source: unknown 29946 1726882581.38282: variable 'ansible_shell_executable' from source: unknown 29946 1726882581.38284: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882581.38289: variable 'ansible_pipelining' from source: unknown 29946 1726882581.38291: variable 'ansible_timeout' from source: unknown 29946 1726882581.38294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882581.38374: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882581.38406: variable 'omit' from source: magic vars 29946 1726882581.38417: starting attempt loop 29946 1726882581.38423: running the handler 29946 1726882581.38480: _low_level_execute_command(): starting 29946 1726882581.38483: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882581.39099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.39113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.39131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.39252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.39276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.39488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.40976: stdout chunk (state=3): >>>/root <<< 29946 1726882581.41123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.41126: stdout chunk (state=3): >>><<< 29946 1726882581.41128: stderr chunk (state=3): >>><<< 29946 1726882581.41147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.41167: _low_level_execute_command(): starting 29946 1726882581.41176: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836 `" && echo ansible-tmp-1726882581.4115813-30282-176145809380836="` echo /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836 `" ) && sleep 0' 29946 1726882581.41773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.41791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.41821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.41847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.41944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.42004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.42073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.43934: stdout chunk (state=3): >>>ansible-tmp-1726882581.4115813-30282-176145809380836=/root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836 <<< 29946 1726882581.44036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.44499: stderr chunk (state=3): >>><<< 29946 1726882581.44502: stdout chunk (state=3): >>><<< 29946 1726882581.44505: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882581.4115813-30282-176145809380836=/root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.44508: variable 'ansible_module_compression' from source: unknown 29946 1726882581.44510: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882581.44512: variable 'ansible_facts' from source: unknown 29946 1726882581.44514: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py 29946 1726882581.44644: Sending initial data 29946 1726882581.44746: Sent initial data (156 bytes) 29946 1726882581.45643: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.45647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882581.45650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.45656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.45659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.45869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.45904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.45924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.46027: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.47602: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882581.47680: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882581.47756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpdyiw6hh8 /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py <<< 29946 1726882581.47764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py" <<< 29946 1726882581.47819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpdyiw6hh8" to remote "/root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py" <<< 29946 1726882581.49438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.49441: stderr chunk (state=3): >>><<< 29946 1726882581.49444: stdout chunk (state=3): >>><<< 29946 1726882581.49446: done transferring module to remote 29946 1726882581.49448: _low_level_execute_command(): starting 29946 1726882581.49465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/ /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py && sleep 0' 29946 1726882581.50268: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.50282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.50339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.50408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.50445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.50463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.50556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.52473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 
1726882581.52509: stdout chunk (state=3): >>><<< 29946 1726882581.52530: stderr chunk (state=3): >>><<< 29946 1726882581.52698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.52701: _low_level_execute_command(): starting 29946 1726882581.52704: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/AnsiballZ_command.py && sleep 0' 29946 1726882581.53492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.53502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.53675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.53679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.53682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882581.53685: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.53688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.53691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.53743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.69383: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:36:21.688709", "end": "2024-09-20 21:36:21.692380", "delta": "0:00:00.003671", "msg": "", "invocation": {"module_args": {"_raw_params": "ip 
link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882581.70933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882581.70961: stderr chunk (state=3): >>><<< 29946 1726882581.70965: stdout chunk (state=3): >>><<< 29946 1726882581.71106: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-20 21:36:21.688709", "end": "2024-09-20 21:36:21.692380", "delta": "0:00:00.003671", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882581.71110: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882581.71113: _low_level_execute_command(): starting 29946 1726882581.71115: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882581.4115813-30282-176145809380836/ > /dev/null 2>&1 && sleep 0' 29946 1726882581.71733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.71737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.71755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.71822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.71843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.71873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.71973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.73918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.73921: stdout chunk (state=3): >>><<< 29946 1726882581.73924: stderr chunk (state=3): >>><<< 29946 1726882581.74302: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.74306: handler run complete 29946 1726882581.74308: Evaluated conditional (False): False 29946 1726882581.74310: attempt loop complete, returning result 29946 1726882581.74312: variable 'item' from source: unknown 29946 1726882581.74337: variable 'item' from source: unknown ok: [managed_node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.003671", "end": "2024-09-20 21:36:21.692380", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-20 21:36:21.688709" } 29946 1726882581.74519: dumping result to json 29946 1726882581.74523: done dumping result, returning 29946 1726882581.74525: done running TaskExecutor() for managed_node2/TASK: Create veth interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000001d0] 29946 1726882581.74527: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d0 29946 1726882581.75589: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d0 29946 1726882581.75592: WORKER PROCESS EXITING 29946 1726882581.75727: no more pending results, returning what we have 29946 1726882581.75730: results queue empty 29946 1726882581.75731: checking for any_errors_fatal 29946 1726882581.75734: done checking for any_errors_fatal 29946 1726882581.75742: checking for max_fail_percentage 29946 1726882581.75744: done checking for max_fail_percentage 29946 1726882581.75745: checking to see if all hosts have failed and the running result is not ok 29946 1726882581.75745: done checking to see if all hosts have failed 29946 1726882581.75746: getting the remaining hosts for this loop 29946 1726882581.75747: done getting the remaining hosts for this loop 29946 1726882581.75750: getting the next task for host managed_node2 29946 1726882581.75755: done getting next task for host managed_node2 29946 1726882581.75766: ^ task is: TASK: Set up veth as managed by NetworkManager 29946 1726882581.75769: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882581.75772: getting variables 29946 1726882581.75773: in VariableManager get_vars() 29946 1726882581.75799: Calling all_inventory to load vars for managed_node2 29946 1726882581.75801: Calling groups_inventory to load vars for managed_node2 29946 1726882581.75803: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882581.75812: Calling all_plugins_play to load vars for managed_node2 29946 1726882581.75815: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882581.75818: Calling groups_plugins_play to load vars for managed_node2 29946 1726882581.75998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882581.76225: done with get_vars() 29946 1726882581.76234: done getting variables 29946 1726882581.76287: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Friday 20 September 2024 21:36:21 -0400 (0:00:01.170) 0:00:07.872 ****** 29946 1726882581.76325: entering _queue_task() for managed_node2/command 29946 1726882581.76580: worker is 1 (out of 1 available) 29946 1726882581.76591: exiting _queue_task() for managed_node2/command 29946 1726882581.76605: done queuing things up, now waiting for results queue to drain 29946 1726882581.76607: waiting for pending results... 
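At this point the task "Create veth interface ethtest0" has finished: across its three loop items it created the veth pair and brought both ends up, and the play moves on to "Set up veth as managed by NetworkManager". For replaying the same sequence outside Ansible, the commands below are taken verbatim from the item results above; the Python wrapper is only a convenience and assumes root privileges and iproute2 on the host.

    import subprocess

    # The three loop items from the veth-creation task, exactly as reported in the log.
    commands = [
        ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"],
        ["ip", "link", "set", "peerethtest0", "up"],
        ["ip", "link", "set", "ethtest0", "up"],
    ]
    for cmd in commands:
        proc = subprocess.run(cmd, capture_output=True, text=True)
        print(" ".join(cmd), "-> rc", proc.returncode)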
29946 1726882581.77007: running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager 29946 1726882581.77012: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d1 29946 1726882581.77015: variable 'ansible_search_path' from source: unknown 29946 1726882581.77018: variable 'ansible_search_path' from source: unknown 29946 1726882581.77021: calling self._execute() 29946 1726882581.77088: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882581.77104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882581.77120: variable 'omit' from source: magic vars 29946 1726882581.77512: variable 'ansible_distribution_major_version' from source: facts 29946 1726882581.77534: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882581.77712: variable 'type' from source: set_fact 29946 1726882581.77722: variable 'state' from source: include params 29946 1726882581.77732: Evaluated conditional (type == 'veth' and state == 'present'): True 29946 1726882581.77748: variable 'omit' from source: magic vars 29946 1726882581.77798: variable 'omit' from source: magic vars 29946 1726882581.77899: variable 'interface' from source: set_fact 29946 1726882581.77920: variable 'omit' from source: magic vars 29946 1726882581.77962: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882581.78007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882581.78031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882581.78051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882581.78067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882581.78104: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882581.78413: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882581.78417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882581.78431: Set connection var ansible_pipelining to False 29946 1726882581.78443: Set connection var ansible_shell_executable to /bin/sh 29946 1726882581.78452: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882581.78462: Set connection var ansible_timeout to 10 29946 1726882581.78472: Set connection var ansible_shell_type to sh 29946 1726882581.78479: Set connection var ansible_connection to ssh 29946 1726882581.78518: variable 'ansible_shell_executable' from source: unknown 29946 1726882581.78724: variable 'ansible_connection' from source: unknown 29946 1726882581.78727: variable 'ansible_module_compression' from source: unknown 29946 1726882581.78730: variable 'ansible_shell_type' from source: unknown 29946 1726882581.78732: variable 'ansible_shell_executable' from source: unknown 29946 1726882581.78734: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882581.78737: variable 'ansible_pipelining' from source: unknown 29946 1726882581.78739: variable 'ansible_timeout' from source: unknown 29946 1726882581.78741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882581.78930: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882581.78950: variable 'omit' from source: magic vars 29946 1726882581.78960: starting attempt loop 29946 1726882581.78966: running the handler 29946 1726882581.78985: _low_level_execute_command(): starting 29946 1726882581.79060: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882581.79931: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.79944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.79957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.79975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.79997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882581.80010: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882581.80113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.80135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.80233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.81937: stdout chunk (state=3): >>>/root <<< 29946 1726882581.82303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.82306: stdout chunk (state=3): >>><<< 29946 1726882581.82309: stderr chunk (state=3): >>><<< 29946 1726882581.82312: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.82314: _low_level_execute_command(): starting 29946 1726882581.82316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662 `" && echo ansible-tmp-1726882581.82169-30352-182303662716662="` echo /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662 `" ) && sleep 0' 29946 1726882581.83611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.83813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.83825: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.83900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.85805: stdout chunk (state=3): >>>ansible-tmp-1726882581.82169-30352-182303662716662=/root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662 <<< 29946 1726882581.85905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.85938: stderr chunk (state=3): >>><<< 29946 1726882581.86304: stdout chunk (state=3): >>><<< 29946 1726882581.86308: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882581.82169-30352-182303662716662=/root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.86311: variable 'ansible_module_compression' from source: unknown 29946 1726882581.86313: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882581.86315: variable 'ansible_facts' from source: unknown 29946 1726882581.86452: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py 29946 1726882581.86763: Sending initial data 29946 1726882581.86766: Sent initial data (154 bytes) 29946 1726882581.87707: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.87785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.87813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.87842: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.87921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.89530: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882581.89552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882581.89724: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp186gc452 /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py <<< 29946 1726882581.89727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py" <<< 29946 1726882581.89770: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp186gc452" to remote "/root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py" <<< 29946 1726882581.90897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.90972: stderr chunk (state=3): >>><<< 29946 1726882581.90990: stdout chunk (state=3): >>><<< 29946 1726882581.91057: done transferring module to remote 29946 1726882581.91080: _low_level_execute_command(): starting 29946 1726882581.91098: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/ /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py && sleep 0' 29946 1726882581.91710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.91816: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.91874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882581.93670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882581.93674: stdout chunk (state=3): >>><<< 29946 1726882581.93676: stderr chunk (state=3): >>><<< 29946 1726882581.93699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882581.93788: _low_level_execute_command(): starting 29946 1726882581.93792: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/AnsiballZ_command.py && sleep 0' 29946 1726882581.94267: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882581.94285: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882581.94305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882581.94325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882581.94417: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882581.94445: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882581.94462: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882581.94481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882581.94575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.11649: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:36:22.095496", "end": "2024-09-20 21:36:22.115448", "delta": "0:00:00.019952", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882582.13264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882582.13504: stdout chunk (state=3): >>><<< 29946 1726882582.13507: stderr chunk (state=3): >>><<< 29946 1726882582.13510: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-20 21:36:22.095496", "end": "2024-09-20 21:36:22.115448", "delta": "0:00:00.019952", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
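For orientation, the task being executed here (manage_test_interface.yml:35) is not reproduced in this log. Based on the evaluated conditional (type == 'veth' and state == 'present'), the command in the module result (nmcli d set ethtest0 managed true), and the fact that the task result reported below shows "changed": false even though the module itself returned "changed": true, it plausibly looks like the following sketch; the exact YAML, including the changed_when line, is an inference rather than something visible in the trace:

- name: Set up veth as managed by NetworkManager
  command: nmcli d set {{ interface }} managed true   # interface resolves to ethtest0 via set_fact, per the trace
  when: type == 'veth' and state == 'present'         # matches "Evaluated conditional (...): True" above
  changed_when: false                                 # would explain the final "changed": false despite the module reporting a change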
29946 1726882582.13629: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882582.13699: _low_level_execute_command(): starting 29946 1726882582.13702: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882581.82169-30352-182303662716662/ > /dev/null 2>&1 && sleep 0' 29946 1726882582.14450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882582.14463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.14498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.14600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.14622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.14724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.16646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.16657: stdout chunk (state=3): >>><<< 29946 1726882582.16660: stderr chunk (state=3): >>><<< 29946 1726882582.16707: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.16715: handler run complete 29946 1726882582.16775: Evaluated conditional (False): False 29946 1726882582.16896: attempt loop complete, returning result 29946 1726882582.16899: _execute() done 29946 1726882582.16902: dumping result to json 29946 1726882582.16904: done dumping result, returning 29946 1726882582.16906: done running TaskExecutor() for managed_node2/TASK: Set up veth as managed by NetworkManager [12673a56-9f93-95e7-9dfb-0000000001d1] 29946 1726882582.16908: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d1 29946 1726882582.17219: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d1 29946 1726882582.17222: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.019952", "end": "2024-09-20 21:36:22.115448", "rc": 0, "start": "2024-09-20 21:36:22.095496" } 29946 1726882582.17284: no more pending results, returning what we have 29946 1726882582.17289: results queue empty 29946 1726882582.17290: checking for any_errors_fatal 29946 1726882582.17303: done checking for any_errors_fatal 29946 1726882582.17304: checking for max_fail_percentage 29946 1726882582.17306: done checking for max_fail_percentage 29946 1726882582.17306: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.17307: done checking to see if all hosts have failed 29946 1726882582.17308: getting the remaining hosts for this loop 29946 1726882582.17309: done getting the remaining hosts for this loop 29946 1726882582.17313: getting the next task for host managed_node2 29946 1726882582.17318: done getting next task for host managed_node2 29946 1726882582.17320: ^ task is: TASK: Delete veth interface {{ interface }} 29946 1726882582.17323: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.17327: getting variables 29946 1726882582.17329: in VariableManager get_vars() 29946 1726882582.17366: Calling all_inventory to load vars for managed_node2 29946 1726882582.17368: Calling groups_inventory to load vars for managed_node2 29946 1726882582.17370: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.17381: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.17384: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.17391: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.17800: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.18074: done with get_vars() 29946 1726882582.18085: done getting variables 29946 1726882582.18146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882582.18274: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Friday 20 September 2024 21:36:22 -0400 (0:00:00.419) 0:00:08.292 ****** 29946 1726882582.18310: entering _queue_task() for managed_node2/command 29946 1726882582.18589: worker is 1 (out of 1 available) 29946 1726882582.18715: exiting _queue_task() for managed_node2/command 29946 1726882582.18725: done queuing things up, now waiting for results queue to drain 29946 1726882582.18726: waiting for pending results... 
29946 1726882582.18884: running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 29946 1726882582.19034: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d2 29946 1726882582.19039: variable 'ansible_search_path' from source: unknown 29946 1726882582.19042: variable 'ansible_search_path' from source: unknown 29946 1726882582.19141: calling self._execute() 29946 1726882582.19178: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.19195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.19210: variable 'omit' from source: magic vars 29946 1726882582.19611: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.19632: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.19864: variable 'type' from source: set_fact 29946 1726882582.19874: variable 'state' from source: include params 29946 1726882582.19882: variable 'interface' from source: set_fact 29946 1726882582.19898: variable 'current_interfaces' from source: set_fact 29946 1726882582.19910: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 29946 1726882582.19998: when evaluation is False, skipping this task 29946 1726882582.20003: _execute() done 29946 1726882582.20006: dumping result to json 29946 1726882582.20009: done dumping result, returning 29946 1726882582.20011: done running TaskExecutor() for managed_node2/TASK: Delete veth interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000001d2] 29946 1726882582.20014: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d2 29946 1726882582.20076: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d2 29946 1726882582.20079: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 29946 1726882582.20134: no more pending results, returning what we have 29946 1726882582.20138: results queue empty 29946 1726882582.20139: checking for any_errors_fatal 29946 1726882582.20150: done checking for any_errors_fatal 29946 1726882582.20151: checking for max_fail_percentage 29946 1726882582.20153: done checking for max_fail_percentage 29946 1726882582.20154: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.20155: done checking to see if all hosts have failed 29946 1726882582.20156: getting the remaining hosts for this loop 29946 1726882582.20158: done getting the remaining hosts for this loop 29946 1726882582.20162: getting the next task for host managed_node2 29946 1726882582.20169: done getting next task for host managed_node2 29946 1726882582.20171: ^ task is: TASK: Create dummy interface {{ interface }} 29946 1726882582.20175: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.20180: getting variables 29946 1726882582.20182: in VariableManager get_vars() 29946 1726882582.20331: Calling all_inventory to load vars for managed_node2 29946 1726882582.20334: Calling groups_inventory to load vars for managed_node2 29946 1726882582.20336: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.20347: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.20350: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.20352: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.20714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.20935: done with get_vars() 29946 1726882582.20953: done getting variables 29946 1726882582.21020: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882582.21143: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Friday 20 September 2024 21:36:22 -0400 (0:00:00.028) 0:00:08.321 ****** 29946 1726882582.21175: entering _queue_task() for managed_node2/command 29946 1726882582.21448: worker is 1 (out of 1 available) 29946 1726882582.21460: exiting _queue_task() for managed_node2/command 29946 1726882582.21473: done queuing things up, now waiting for results queue to drain 29946 1726882582.21475: waiting for pending results... 
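The "Delete veth interface ethtest0" task above, and the dummy/tap tasks traced next, are all skipped because their conditions evaluate to False for this combination of type=veth and state=present. The conditions are quoted verbatim in the false_condition fields of the skip results; only the command bodies in the following sketch are assumptions, not taken from the log:

- name: Delete veth interface {{ interface }}
  command: ip link del {{ interface }}             # command body assumed; only the condition is visible in the log
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

- name: Create dummy interface {{ interface }}
  command: ip link add {{ interface }} type dummy  # command body assumed
  when: type == 'dummy' and state == 'present' and interface not in current_interfaces

The remaining "Delete dummy", "Create tap", and "Delete tap" tasks that follow have the same shape, with type switched to 'dummy' or 'tap' and state switched between 'present' and 'absent'.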
29946 1726882582.21907: running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 29946 1726882582.21918: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d3 29946 1726882582.21924: variable 'ansible_search_path' from source: unknown 29946 1726882582.21928: variable 'ansible_search_path' from source: unknown 29946 1726882582.21930: calling self._execute() 29946 1726882582.22017: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.22036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.22050: variable 'omit' from source: magic vars 29946 1726882582.22394: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.22411: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.22618: variable 'type' from source: set_fact 29946 1726882582.22683: variable 'state' from source: include params 29946 1726882582.22686: variable 'interface' from source: set_fact 29946 1726882582.22688: variable 'current_interfaces' from source: set_fact 29946 1726882582.22691: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 29946 1726882582.22694: when evaluation is False, skipping this task 29946 1726882582.22697: _execute() done 29946 1726882582.22700: dumping result to json 29946 1726882582.22702: done dumping result, returning 29946 1726882582.22704: done running TaskExecutor() for managed_node2/TASK: Create dummy interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000001d3] 29946 1726882582.22707: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d3 skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 29946 1726882582.22832: no more pending results, returning what we have 29946 1726882582.22836: results queue empty 29946 1726882582.22837: checking for any_errors_fatal 29946 1726882582.22844: done checking for any_errors_fatal 29946 1726882582.22845: checking for max_fail_percentage 29946 1726882582.22847: done checking for max_fail_percentage 29946 1726882582.22848: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.22849: done checking to see if all hosts have failed 29946 1726882582.22850: getting the remaining hosts for this loop 29946 1726882582.22852: done getting the remaining hosts for this loop 29946 1726882582.22856: getting the next task for host managed_node2 29946 1726882582.22862: done getting next task for host managed_node2 29946 1726882582.22865: ^ task is: TASK: Delete dummy interface {{ interface }} 29946 1726882582.22869: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.22874: getting variables 29946 1726882582.22875: in VariableManager get_vars() 29946 1726882582.22914: Calling all_inventory to load vars for managed_node2 29946 1726882582.22916: Calling groups_inventory to load vars for managed_node2 29946 1726882582.22919: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.22931: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.22934: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.22937: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.23425: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d3 29946 1726882582.23428: WORKER PROCESS EXITING 29946 1726882582.23450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.23658: done with get_vars() 29946 1726882582.23667: done getting variables 29946 1726882582.23728: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882582.23831: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Friday 20 September 2024 21:36:22 -0400 (0:00:00.026) 0:00:08.348 ****** 29946 1726882582.23859: entering _queue_task() for managed_node2/command 29946 1726882582.24134: worker is 1 (out of 1 available) 29946 1726882582.24145: exiting _queue_task() for managed_node2/command 29946 1726882582.24269: done queuing things up, now waiting for results queue to drain 29946 1726882582.24271: waiting for pending results... 
29946 1726882582.24366: running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 29946 1726882582.24487: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d4 29946 1726882582.24597: variable 'ansible_search_path' from source: unknown 29946 1726882582.24601: variable 'ansible_search_path' from source: unknown 29946 1726882582.24604: calling self._execute() 29946 1726882582.24635: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.24646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.24658: variable 'omit' from source: magic vars 29946 1726882582.25017: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.25061: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.25187: variable 'type' from source: set_fact 29946 1726882582.25195: variable 'state' from source: include params 29946 1726882582.25199: variable 'interface' from source: set_fact 29946 1726882582.25203: variable 'current_interfaces' from source: set_fact 29946 1726882582.25220: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 29946 1726882582.25223: when evaluation is False, skipping this task 29946 1726882582.25226: _execute() done 29946 1726882582.25228: dumping result to json 29946 1726882582.25230: done dumping result, returning 29946 1726882582.25233: done running TaskExecutor() for managed_node2/TASK: Delete dummy interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000001d4] 29946 1726882582.25236: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d4 29946 1726882582.25311: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d4 29946 1726882582.25314: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 29946 1726882582.25356: no more pending results, returning what we have 29946 1726882582.25360: results queue empty 29946 1726882582.25361: checking for any_errors_fatal 29946 1726882582.25367: done checking for any_errors_fatal 29946 1726882582.25368: checking for max_fail_percentage 29946 1726882582.25369: done checking for max_fail_percentage 29946 1726882582.25370: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.25371: done checking to see if all hosts have failed 29946 1726882582.25372: getting the remaining hosts for this loop 29946 1726882582.25373: done getting the remaining hosts for this loop 29946 1726882582.25376: getting the next task for host managed_node2 29946 1726882582.25381: done getting next task for host managed_node2 29946 1726882582.25383: ^ task is: TASK: Create tap interface {{ interface }} 29946 1726882582.25385: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.25389: getting variables 29946 1726882582.25390: in VariableManager get_vars() 29946 1726882582.25425: Calling all_inventory to load vars for managed_node2 29946 1726882582.25427: Calling groups_inventory to load vars for managed_node2 29946 1726882582.25429: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.25437: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.25439: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.25442: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.25561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.25690: done with get_vars() 29946 1726882582.25700: done getting variables 29946 1726882582.25741: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882582.25813: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Friday 20 September 2024 21:36:22 -0400 (0:00:00.019) 0:00:08.367 ****** 29946 1726882582.25834: entering _queue_task() for managed_node2/command 29946 1726882582.26004: worker is 1 (out of 1 available) 29946 1726882582.26016: exiting _queue_task() for managed_node2/command 29946 1726882582.26028: done queuing things up, now waiting for results queue to drain 29946 1726882582.26029: waiting for pending results... 
29946 1726882582.26171: running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 29946 1726882582.26232: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d5 29946 1726882582.26243: variable 'ansible_search_path' from source: unknown 29946 1726882582.26248: variable 'ansible_search_path' from source: unknown 29946 1726882582.26274: calling self._execute() 29946 1726882582.26342: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.26345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.26353: variable 'omit' from source: magic vars 29946 1726882582.26595: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.26605: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.26749: variable 'type' from source: set_fact 29946 1726882582.26752: variable 'state' from source: include params 29946 1726882582.26755: variable 'interface' from source: set_fact 29946 1726882582.26758: variable 'current_interfaces' from source: set_fact 29946 1726882582.26765: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 29946 1726882582.26767: when evaluation is False, skipping this task 29946 1726882582.26770: _execute() done 29946 1726882582.26773: dumping result to json 29946 1726882582.26775: done dumping result, returning 29946 1726882582.26781: done running TaskExecutor() for managed_node2/TASK: Create tap interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000001d5] 29946 1726882582.26797: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d5 29946 1726882582.27048: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d5 29946 1726882582.27051: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 29946 1726882582.27085: no more pending results, returning what we have 29946 1726882582.27091: results queue empty 29946 1726882582.27092: checking for any_errors_fatal 29946 1726882582.27098: done checking for any_errors_fatal 29946 1726882582.27099: checking for max_fail_percentage 29946 1726882582.27101: done checking for max_fail_percentage 29946 1726882582.27102: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.27103: done checking to see if all hosts have failed 29946 1726882582.27103: getting the remaining hosts for this loop 29946 1726882582.27105: done getting the remaining hosts for this loop 29946 1726882582.27108: getting the next task for host managed_node2 29946 1726882582.27113: done getting next task for host managed_node2 29946 1726882582.27115: ^ task is: TASK: Delete tap interface {{ interface }} 29946 1726882582.27118: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.27123: getting variables 29946 1726882582.27124: in VariableManager get_vars() 29946 1726882582.27153: Calling all_inventory to load vars for managed_node2 29946 1726882582.27155: Calling groups_inventory to load vars for managed_node2 29946 1726882582.27158: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.27166: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.27169: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.27172: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.27408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.27622: done with get_vars() 29946 1726882582.27632: done getting variables 29946 1726882582.27689: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882582.27801: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Friday 20 September 2024 21:36:22 -0400 (0:00:00.019) 0:00:08.387 ****** 29946 1726882582.27828: entering _queue_task() for managed_node2/command 29946 1726882582.28044: worker is 1 (out of 1 available) 29946 1726882582.28057: exiting _queue_task() for managed_node2/command 29946 1726882582.28068: done queuing things up, now waiting for results queue to drain 29946 1726882582.28069: waiting for pending results... 
29946 1726882582.28237: running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 29946 1726882582.28299: in run() - task 12673a56-9f93-95e7-9dfb-0000000001d6 29946 1726882582.28311: variable 'ansible_search_path' from source: unknown 29946 1726882582.28316: variable 'ansible_search_path' from source: unknown 29946 1726882582.28341: calling self._execute() 29946 1726882582.28405: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.28410: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.28419: variable 'omit' from source: magic vars 29946 1726882582.28659: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.28668: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.28798: variable 'type' from source: set_fact 29946 1726882582.28801: variable 'state' from source: include params 29946 1726882582.28806: variable 'interface' from source: set_fact 29946 1726882582.28808: variable 'current_interfaces' from source: set_fact 29946 1726882582.28817: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 29946 1726882582.28821: when evaluation is False, skipping this task 29946 1726882582.28824: _execute() done 29946 1726882582.28828: dumping result to json 29946 1726882582.28831: done dumping result, returning 29946 1726882582.28833: done running TaskExecutor() for managed_node2/TASK: Delete tap interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000001d6] 29946 1726882582.28836: sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d6 29946 1726882582.28913: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000001d6 29946 1726882582.28916: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 29946 1726882582.28985: no more pending results, returning what we have 29946 1726882582.28987: results queue empty 29946 1726882582.28988: checking for any_errors_fatal 29946 1726882582.28992: done checking for any_errors_fatal 29946 1726882582.28994: checking for max_fail_percentage 29946 1726882582.28996: done checking for max_fail_percentage 29946 1726882582.28997: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.28998: done checking to see if all hosts have failed 29946 1726882582.28999: getting the remaining hosts for this loop 29946 1726882582.29000: done getting the remaining hosts for this loop 29946 1726882582.29003: getting the next task for host managed_node2 29946 1726882582.29009: done getting next task for host managed_node2 29946 1726882582.29012: ^ task is: TASK: Include the task 'assert_device_present.yml' 29946 1726882582.29014: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.29017: getting variables 29946 1726882582.29018: in VariableManager get_vars() 29946 1726882582.29040: Calling all_inventory to load vars for managed_node2 29946 1726882582.29042: Calling groups_inventory to load vars for managed_node2 29946 1726882582.29043: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.29049: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.29051: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.29052: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.29233: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.29435: done with get_vars() 29946 1726882582.29444: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:20 Friday 20 September 2024 21:36:22 -0400 (0:00:00.016) 0:00:08.404 ****** 29946 1726882582.29517: entering _queue_task() for managed_node2/include_tasks 29946 1726882582.29720: worker is 1 (out of 1 available) 29946 1726882582.29733: exiting _queue_task() for managed_node2/include_tasks 29946 1726882582.29743: done queuing things up, now waiting for results queue to drain 29946 1726882582.29745: waiting for pending results... 29946 1726882582.30409: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' 29946 1726882582.30414: in run() - task 12673a56-9f93-95e7-9dfb-00000000000e 29946 1726882582.30417: variable 'ansible_search_path' from source: unknown 29946 1726882582.30420: calling self._execute() 29946 1726882582.30616: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.30619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.30621: variable 'omit' from source: magic vars 29946 1726882582.30936: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.31051: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.31055: _execute() done 29946 1726882582.31057: dumping result to json 29946 1726882582.31059: done dumping result, returning 29946 1726882582.31061: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_present.yml' [12673a56-9f93-95e7-9dfb-00000000000e] 29946 1726882582.31064: sending task result for task 12673a56-9f93-95e7-9dfb-00000000000e 29946 1726882582.31330: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000000e 29946 1726882582.31333: WORKER PROCESS EXITING 29946 1726882582.31398: no more pending results, returning what we have 29946 1726882582.31403: in VariableManager get_vars() 29946 1726882582.31444: Calling all_inventory to load vars for managed_node2 29946 1726882582.31447: Calling groups_inventory to load vars for managed_node2 29946 1726882582.31449: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.31461: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.31464: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.31466: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.31691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.31895: done with get_vars() 29946 
1726882582.31903: variable 'ansible_search_path' from source: unknown 29946 1726882582.31915: we have included files to process 29946 1726882582.31916: generating all_blocks data 29946 1726882582.31918: done generating all_blocks data 29946 1726882582.31921: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 29946 1726882582.31922: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 29946 1726882582.31924: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 29946 1726882582.32068: in VariableManager get_vars() 29946 1726882582.32086: done with get_vars() 29946 1726882582.32197: done processing included file 29946 1726882582.32198: iterating over new_blocks loaded from include file 29946 1726882582.32199: in VariableManager get_vars() 29946 1726882582.32209: done with get_vars() 29946 1726882582.32210: filtering new block on tags 29946 1726882582.32222: done filtering new block on tags 29946 1726882582.32224: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed_node2 29946 1726882582.32227: extending task lists for all hosts with included blocks 29946 1726882582.33189: done extending task lists 29946 1726882582.33190: done processing included files 29946 1726882582.33191: results queue empty 29946 1726882582.33191: checking for any_errors_fatal 29946 1726882582.33195: done checking for any_errors_fatal 29946 1726882582.33196: checking for max_fail_percentage 29946 1726882582.33197: done checking for max_fail_percentage 29946 1726882582.33198: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.33198: done checking to see if all hosts have failed 29946 1726882582.33199: getting the remaining hosts for this loop 29946 1726882582.33199: done getting the remaining hosts for this loop 29946 1726882582.33201: getting the next task for host managed_node2 29946 1726882582.33203: done getting next task for host managed_node2 29946 1726882582.33205: ^ task is: TASK: Include the task 'get_interface_stat.yml' 29946 1726882582.33206: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.33208: getting variables 29946 1726882582.33208: in VariableManager get_vars() 29946 1726882582.33216: Calling all_inventory to load vars for managed_node2 29946 1726882582.33218: Calling groups_inventory to load vars for managed_node2 29946 1726882582.33220: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.33224: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.33225: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.33227: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.33315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.33431: done with get_vars() 29946 1726882582.33439: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:36:22 -0400 (0:00:00.039) 0:00:08.444 ****** 29946 1726882582.33483: entering _queue_task() for managed_node2/include_tasks 29946 1726882582.33683: worker is 1 (out of 1 available) 29946 1726882582.33700: exiting _queue_task() for managed_node2/include_tasks 29946 1726882582.33713: done queuing things up, now waiting for results queue to drain 29946 1726882582.33715: waiting for pending results... 29946 1726882582.33935: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 29946 1726882582.34202: in run() - task 12673a56-9f93-95e7-9dfb-0000000002ec 29946 1726882582.34205: variable 'ansible_search_path' from source: unknown 29946 1726882582.34208: variable 'ansible_search_path' from source: unknown 29946 1726882582.34210: calling self._execute() 29946 1726882582.34213: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.34215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.34217: variable 'omit' from source: magic vars 29946 1726882582.34679: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.34701: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.34733: _execute() done 29946 1726882582.34743: dumping result to json 29946 1726882582.34756: done dumping result, returning 29946 1726882582.34789: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-95e7-9dfb-0000000002ec] 29946 1726882582.34803: sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ec 29946 1726882582.34985: no more pending results, returning what we have 29946 1726882582.34990: in VariableManager get_vars() 29946 1726882582.35033: Calling all_inventory to load vars for managed_node2 29946 1726882582.35036: Calling groups_inventory to load vars for managed_node2 29946 1726882582.35039: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.35051: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.35054: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.35057: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.35468: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ec 29946 1726882582.35471: WORKER PROCESS EXITING 29946 1726882582.35491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 29946 1726882582.35696: done with get_vars() 29946 1726882582.35704: variable 'ansible_search_path' from source: unknown 29946 1726882582.35706: variable 'ansible_search_path' from source: unknown 29946 1726882582.35742: we have included files to process 29946 1726882582.35744: generating all_blocks data 29946 1726882582.35745: done generating all_blocks data 29946 1726882582.35747: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29946 1726882582.35748: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29946 1726882582.35750: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29946 1726882582.35949: done processing included file 29946 1726882582.35951: iterating over new_blocks loaded from include file 29946 1726882582.35952: in VariableManager get_vars() 29946 1726882582.35965: done with get_vars() 29946 1726882582.35967: filtering new block on tags 29946 1726882582.35980: done filtering new block on tags 29946 1726882582.35982: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 29946 1726882582.35986: extending task lists for all hosts with included blocks 29946 1726882582.36079: done extending task lists 29946 1726882582.36080: done processing included files 29946 1726882582.36081: results queue empty 29946 1726882582.36082: checking for any_errors_fatal 29946 1726882582.36084: done checking for any_errors_fatal 29946 1726882582.36085: checking for max_fail_percentage 29946 1726882582.36086: done checking for max_fail_percentage 29946 1726882582.36087: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.36088: done checking to see if all hosts have failed 29946 1726882582.36088: getting the remaining hosts for this loop 29946 1726882582.36089: done getting the remaining hosts for this loop 29946 1726882582.36092: getting the next task for host managed_node2 29946 1726882582.36097: done getting next task for host managed_node2 29946 1726882582.36099: ^ task is: TASK: Get stat for interface {{ interface }} 29946 1726882582.36102: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.36104: getting variables 29946 1726882582.36105: in VariableManager get_vars() 29946 1726882582.36114: Calling all_inventory to load vars for managed_node2 29946 1726882582.36116: Calling groups_inventory to load vars for managed_node2 29946 1726882582.36118: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.36122: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.36125: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.36127: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.36349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.36583: done with get_vars() 29946 1726882582.36592: done getting variables 29946 1726882582.36873: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:22 -0400 (0:00:00.034) 0:00:08.478 ****** 29946 1726882582.36940: entering _queue_task() for managed_node2/stat 29946 1726882582.37164: worker is 1 (out of 1 available) 29946 1726882582.37176: exiting _queue_task() for managed_node2/stat 29946 1726882582.37187: done queuing things up, now waiting for results queue to drain 29946 1726882582.37189: waiting for pending results... 29946 1726882582.37404: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 29946 1726882582.37530: in run() - task 12673a56-9f93-95e7-9dfb-0000000003b5 29946 1726882582.37645: variable 'ansible_search_path' from source: unknown 29946 1726882582.37651: variable 'ansible_search_path' from source: unknown 29946 1726882582.37654: calling self._execute() 29946 1726882582.37714: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.37729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.37746: variable 'omit' from source: magic vars 29946 1726882582.38425: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.38428: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.38431: variable 'omit' from source: magic vars 29946 1726882582.38433: variable 'omit' from source: magic vars 29946 1726882582.38901: variable 'interface' from source: set_fact 29946 1726882582.38905: variable 'omit' from source: magic vars 29946 1726882582.38908: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882582.38946: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882582.39031: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882582.39056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882582.39114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882582.39146: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882582.39209: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.39219: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 29946 1726882582.39428: Set connection var ansible_pipelining to False 29946 1726882582.39442: Set connection var ansible_shell_executable to /bin/sh 29946 1726882582.39699: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882582.39703: Set connection var ansible_timeout to 10 29946 1726882582.39705: Set connection var ansible_shell_type to sh 29946 1726882582.39708: Set connection var ansible_connection to ssh 29946 1726882582.39710: variable 'ansible_shell_executable' from source: unknown 29946 1726882582.39713: variable 'ansible_connection' from source: unknown 29946 1726882582.39715: variable 'ansible_module_compression' from source: unknown 29946 1726882582.39717: variable 'ansible_shell_type' from source: unknown 29946 1726882582.39720: variable 'ansible_shell_executable' from source: unknown 29946 1726882582.39722: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.39724: variable 'ansible_pipelining' from source: unknown 29946 1726882582.39726: variable 'ansible_timeout' from source: unknown 29946 1726882582.39728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.39961: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882582.40003: variable 'omit' from source: magic vars 29946 1726882582.40047: starting attempt loop 29946 1726882582.40056: running the handler 29946 1726882582.40077: _low_level_execute_command(): starting 29946 1726882582.40091: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882582.40762: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882582.40776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.40792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.40820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.40841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882582.40854: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882582.40911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.40960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882582.40977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.41002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.41120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.42853: stdout chunk 
(state=3): >>>/root <<< 29946 1726882582.42992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.43007: stdout chunk (state=3): >>><<< 29946 1726882582.43127: stderr chunk (state=3): >>><<< 29946 1726882582.43132: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.43134: _low_level_execute_command(): starting 29946 1726882582.43137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682 `" && echo ansible-tmp-1726882582.4304388-30395-227456275352682="` echo /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682 `" ) && sleep 0' 29946 1726882582.43649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882582.43662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.43676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.43701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.43717: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882582.43737: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882582.43810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.43850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882582.43865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.43886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 
1726882582.43979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.45865: stdout chunk (state=3): >>>ansible-tmp-1726882582.4304388-30395-227456275352682=/root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682 <<< 29946 1726882582.45999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.46014: stdout chunk (state=3): >>><<< 29946 1726882582.46033: stderr chunk (state=3): >>><<< 29946 1726882582.46198: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882582.4304388-30395-227456275352682=/root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.46202: variable 'ansible_module_compression' from source: unknown 29946 1726882582.46204: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 29946 1726882582.46206: variable 'ansible_facts' from source: unknown 29946 1726882582.46319: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py 29946 1726882582.46462: Sending initial data 29946 1726882582.46471: Sent initial data (153 bytes) 29946 1726882582.47053: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882582.47063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.47075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.47092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.47184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.47211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.47463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.48935: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882582.49019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882582.49106: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp49hsb8oi /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py <<< 29946 1726882582.49110: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py" <<< 29946 1726882582.49195: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp49hsb8oi" to remote "/root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py" <<< 29946 1726882582.50227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.50231: stdout chunk (state=3): >>><<< 29946 1726882582.50233: stderr chunk (state=3): >>><<< 29946 1726882582.50235: done transferring module to remote 29946 1726882582.50237: _low_level_execute_command(): starting 29946 1726882582.50239: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/ /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py && sleep 0' 29946 1726882582.50810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882582.50827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.50843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.50890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 29946 1726882582.50912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.50925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882582.51004: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.51026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.51046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.51134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.52903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.52913: stdout chunk (state=3): >>><<< 29946 1726882582.52923: stderr chunk (state=3): >>><<< 29946 1726882582.52943: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.52950: _low_level_execute_command(): starting 29946 1726882582.52959: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/AnsiballZ_stat.py && sleep 0' 29946 1726882582.53496: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.53510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.53520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.53565: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.53584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.53658: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.68819: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30821, "dev": 23, "nlink": 1, "atime": 1726882580.9009783, "mtime": 1726882580.9009783, "ctime": 1726882580.9009783, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 29946 1726882582.70287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882582.70291: stdout chunk (state=3): >>><<< 29946 1726882582.70295: stderr chunk (state=3): >>><<< 29946 1726882582.70298: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 30821, "dev": 23, "nlink": 1, "atime": 1726882580.9009783, "mtime": 1726882580.9009783, "ctime": 1726882580.9009783, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882582.70300: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882582.70307: _low_level_execute_command(): starting 29946 1726882582.70309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882582.4304388-30395-227456275352682/ > /dev/null 2>&1 && sleep 0' 29946 1726882582.70862: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882582.70878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.70896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.70916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.70939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882582.70951: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882582.71043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882582.71071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.71166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.72969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.73026: stderr chunk (state=3): >>><<< 29946 1726882582.73039: stdout chunk (state=3): >>><<< 29946 1726882582.73058: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.73098: handler run complete 29946 1726882582.73124: attempt loop complete, returning result 29946 1726882582.73130: _execute() done 29946 1726882582.73138: dumping result to json 29946 1726882582.73152: done dumping result, returning 29946 1726882582.73164: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000003b5] 29946 1726882582.73252: sending task result for task 12673a56-9f93-95e7-9dfb-0000000003b5 29946 1726882582.73329: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000003b5 29946 1726882582.73332: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "atime": 1726882580.9009783, "block_size": 4096, "blocks": 0, "ctime": 1726882580.9009783, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 30821, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1726882580.9009783, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 29946 1726882582.73427: no more pending results, returning what we have 29946 1726882582.73432: results queue empty 29946 1726882582.73433: checking for any_errors_fatal 29946 1726882582.73434: done checking for any_errors_fatal 29946 1726882582.73435: checking for max_fail_percentage 29946 1726882582.73437: done checking for max_fail_percentage 29946 1726882582.73438: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.73438: done checking to see if all hosts have failed 29946 1726882582.73439: getting the remaining hosts for this loop 29946 1726882582.73442: done getting the remaining hosts for this loop 29946 1726882582.73446: getting the next task for host managed_node2 29946 1726882582.73454: done getting next task for host managed_node2 29946 1726882582.73456: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 29946 1726882582.73459: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882582.73464: getting variables 29946 1726882582.73465: in VariableManager get_vars() 29946 1726882582.73738: Calling all_inventory to load vars for managed_node2 29946 1726882582.73741: Calling groups_inventory to load vars for managed_node2 29946 1726882582.73743: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.73755: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.73758: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.73761: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.74038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.74236: done with get_vars() 29946 1726882582.74247: done getting variables 29946 1726882582.74384: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 29946 1726882582.74517: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:36:22 -0400 (0:00:00.376) 0:00:08.855 ****** 29946 1726882582.74547: entering _queue_task() for managed_node2/assert 29946 1726882582.74553: Creating lock for assert 29946 1726882582.74958: worker is 1 (out of 1 available) 29946 1726882582.74969: exiting _queue_task() for managed_node2/assert 29946 1726882582.74978: done queuing things up, now waiting for results queue to drain 29946 1726882582.74980: waiting for pending results... 
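For readers reconstructing the play from this trace, the two included task files exercised in the section above can be summarized as follows. This is a minimal sketch, not the collection's verbatim source: the stat module arguments, the templated task names, the task paths, and the interface_stat.stat.exists condition are taken directly from the trace, while the exact file layout, the use of register (the trace only shows the variable name), and any extra options are assumptions.

# tasks/get_interface_stat.yml (task path .../tasks/get_interface_stat.yml:3 in the trace)
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/ethtest0 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat                    # assumed mechanism; the trace only shows the variable name

# tasks/assert_device_present.yml (task paths .../tasks/assert_device_present.yml:3 and :5)
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists

With interface set to ethtest0 via set_fact, the stat call finds /sys/class/net/ethtest0 as a symlink to ../../devices/virtual/net/ethtest0 (see the module result above), so the assertion that follows passes.
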
29946 1726882582.75217: running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' 29946 1726882582.75461: in run() - task 12673a56-9f93-95e7-9dfb-0000000002ed 29946 1726882582.75466: variable 'ansible_search_path' from source: unknown 29946 1726882582.75469: variable 'ansible_search_path' from source: unknown 29946 1726882582.75702: calling self._execute() 29946 1726882582.75706: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.75811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.75903: variable 'omit' from source: magic vars 29946 1726882582.76655: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.76659: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.76756: variable 'omit' from source: magic vars 29946 1726882582.76878: variable 'omit' from source: magic vars 29946 1726882582.77008: variable 'interface' from source: set_fact 29946 1726882582.77035: variable 'omit' from source: magic vars 29946 1726882582.77091: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882582.77135: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882582.77161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882582.77196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882582.77215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882582.77248: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882582.77258: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.77267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.77384: Set connection var ansible_pipelining to False 29946 1726882582.77409: Set connection var ansible_shell_executable to /bin/sh 29946 1726882582.77499: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882582.77507: Set connection var ansible_timeout to 10 29946 1726882582.77512: Set connection var ansible_shell_type to sh 29946 1726882582.77515: Set connection var ansible_connection to ssh 29946 1726882582.77517: variable 'ansible_shell_executable' from source: unknown 29946 1726882582.77520: variable 'ansible_connection' from source: unknown 29946 1726882582.77522: variable 'ansible_module_compression' from source: unknown 29946 1726882582.77524: variable 'ansible_shell_type' from source: unknown 29946 1726882582.77526: variable 'ansible_shell_executable' from source: unknown 29946 1726882582.77528: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.77530: variable 'ansible_pipelining' from source: unknown 29946 1726882582.77533: variable 'ansible_timeout' from source: unknown 29946 1726882582.77535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.77660: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 29946 1726882582.77674: variable 'omit' from source: magic vars 29946 1726882582.77688: starting attempt loop 29946 1726882582.77698: running the handler 29946 1726882582.77849: variable 'interface_stat' from source: set_fact 29946 1726882582.77872: Evaluated conditional (interface_stat.stat.exists): True 29946 1726882582.77881: handler run complete 29946 1726882582.77904: attempt loop complete, returning result 29946 1726882582.77946: _execute() done 29946 1726882582.77949: dumping result to json 29946 1726882582.77952: done dumping result, returning 29946 1726882582.77954: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is present - 'ethtest0' [12673a56-9f93-95e7-9dfb-0000000002ed] 29946 1726882582.77956: sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ed ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882582.78150: no more pending results, returning what we have 29946 1726882582.78154: results queue empty 29946 1726882582.78156: checking for any_errors_fatal 29946 1726882582.78165: done checking for any_errors_fatal 29946 1726882582.78165: checking for max_fail_percentage 29946 1726882582.78167: done checking for max_fail_percentage 29946 1726882582.78167: checking to see if all hosts have failed and the running result is not ok 29946 1726882582.78168: done checking to see if all hosts have failed 29946 1726882582.78169: getting the remaining hosts for this loop 29946 1726882582.78170: done getting the remaining hosts for this loop 29946 1726882582.78173: getting the next task for host managed_node2 29946 1726882582.78180: done getting next task for host managed_node2 29946 1726882582.78183: ^ task is: TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 29946 1726882582.78185: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882582.78190: getting variables 29946 1726882582.78192: in VariableManager get_vars() 29946 1726882582.78230: Calling all_inventory to load vars for managed_node2 29946 1726882582.78233: Calling groups_inventory to load vars for managed_node2 29946 1726882582.78235: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882582.78243: Calling all_plugins_play to load vars for managed_node2 29946 1726882582.78245: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882582.78248: Calling groups_plugins_play to load vars for managed_node2 29946 1726882582.78428: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000002ed 29946 1726882582.78431: WORKER PROCESS EXITING 29946 1726882582.78441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882582.78569: done with get_vars() 29946 1726882582.78577: done getting variables TASK [Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:23 Friday 20 September 2024 21:36:22 -0400 (0:00:00.040) 0:00:08.895 ****** 29946 1726882582.78642: entering _queue_task() for managed_node2/lineinfile 29946 1726882582.78644: Creating lock for lineinfile 29946 1726882582.78899: worker is 1 (out of 1 available) 29946 1726882582.78910: exiting _queue_task() for managed_node2/lineinfile 29946 1726882582.78922: done queuing things up, now waiting for results queue to drain 29946 1726882582.78924: waiting for pending results... 29946 1726882582.79508: running TaskExecutor() for managed_node2/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table 29946 1726882582.79513: in run() - task 12673a56-9f93-95e7-9dfb-00000000000f 29946 1726882582.79516: variable 'ansible_search_path' from source: unknown 29946 1726882582.79518: calling self._execute() 29946 1726882582.79595: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.79612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.79634: variable 'omit' from source: magic vars 29946 1726882582.80041: variable 'ansible_distribution_major_version' from source: facts 29946 1726882582.80044: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882582.80047: variable 'omit' from source: magic vars 29946 1726882582.80066: variable 'omit' from source: magic vars 29946 1726882582.80112: variable 'omit' from source: magic vars 29946 1726882582.80259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882582.80263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882582.80265: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882582.80268: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882582.80273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882582.80311: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882582.80321: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.80330: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.80444: Set connection var ansible_pipelining to False 29946 1726882582.80457: Set connection var ansible_shell_executable to /bin/sh 29946 1726882582.80468: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882582.80484: Set connection var ansible_timeout to 10 29946 1726882582.80499: Set connection var ansible_shell_type to sh 29946 1726882582.80506: Set connection var ansible_connection to ssh 29946 1726882582.80528: variable 'ansible_shell_executable' from source: unknown 29946 1726882582.80534: variable 'ansible_connection' from source: unknown 29946 1726882582.80539: variable 'ansible_module_compression' from source: unknown 29946 1726882582.80544: variable 'ansible_shell_type' from source: unknown 29946 1726882582.80550: variable 'ansible_shell_executable' from source: unknown 29946 1726882582.80555: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882582.80561: variable 'ansible_pipelining' from source: unknown 29946 1726882582.80566: variable 'ansible_timeout' from source: unknown 29946 1726882582.80585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882582.80784: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882582.80898: variable 'omit' from source: magic vars 29946 1726882582.80904: starting attempt loop 29946 1726882582.80906: running the handler 29946 1726882582.80909: _low_level_execute_command(): starting 29946 1726882582.80911: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882582.81906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882582.81910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.81914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882582.81917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.81955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882582.81990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.82048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.83650: stdout chunk (state=3): >>>/root <<< 29946 1726882582.83820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.83823: stdout chunk 
(state=3): >>><<< 29946 1726882582.83826: stderr chunk (state=3): >>><<< 29946 1726882582.83851: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.83875: _low_level_execute_command(): starting 29946 1726882582.83959: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629 `" && echo ansible-tmp-1726882582.8386168-30428-150970947525629="` echo /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629 `" ) && sleep 0' 29946 1726882582.84806: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.84809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882582.84814: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.84838: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882582.84850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882582.84903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882582.84938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882582.84998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882582.86866: stdout chunk (state=3): >>>ansible-tmp-1726882582.8386168-30428-150970947525629=/root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629 <<< 29946 1726882582.87022: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882582.87025: stdout chunk (state=3): >>><<< 29946 1726882582.87027: stderr chunk (state=3): >>><<< 29946 1726882582.87198: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882582.8386168-30428-150970947525629=/root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882582.87201: variable 'ansible_module_compression' from source: unknown 29946 1726882582.87203: ANSIBALLZ: Using lock for lineinfile 29946 1726882582.87204: ANSIBALLZ: Acquiring lock 29946 1726882582.87206: ANSIBALLZ: Lock acquired: 140626578892144 29946 1726882582.87207: ANSIBALLZ: Creating module 29946 1726882583.00721: ANSIBALLZ: Writing module into payload 29946 1726882583.00849: ANSIBALLZ: Writing module 29946 1726882583.00882: ANSIBALLZ: Renaming module 29946 1726882583.00897: ANSIBALLZ: Done creating module 29946 1726882583.00920: variable 'ansible_facts' from source: unknown 29946 1726882583.01011: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py 29946 1726882583.01168: Sending initial data 29946 1726882583.01177: Sent initial data (159 bytes) 29946 1726882583.01786: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882583.01803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.01838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.01856: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882583.01939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.01943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.01959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.01976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.02074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.03718: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882583.03778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882583.03856: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpxtmdzgfc /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py <<< 29946 1726882583.03859: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py" <<< 29946 1726882583.03919: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpxtmdzgfc" to remote "/root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py" <<< 29946 1726882583.04771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.04775: stdout chunk (state=3): >>><<< 29946 1726882583.04777: stderr chunk (state=3): >>><<< 29946 1726882583.04791: done transferring module to remote 29946 1726882583.04803: _low_level_execute_command(): starting 29946 1726882583.04808: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/ /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py && sleep 0' 29946 1726882583.05236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.05240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882583.05244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882583.05246: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.05252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.05305: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.05309: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.05365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.07204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.07207: stdout chunk (state=3): >>><<< 29946 1726882583.07209: stderr chunk (state=3): >>><<< 29946 1726882583.07212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882583.07215: _low_level_execute_command(): starting 29946 1726882583.07218: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/AnsiballZ_lineinfile.py && sleep 0' 29946 1726882583.07735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882583.07752: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.07755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.07778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.07781: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.07783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.07840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.07847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.07915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.23806: stdout chunk (state=3): >>> <<< 29946 1726882583.23811: stdout chunk (state=3): >>>{"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 29946 1726882583.25138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882583.25171: stderr chunk (state=3): >>><<< 29946 1726882583.25174: stdout chunk (state=3): >>><<< 29946 1726882583.25197: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "msg": "line added", "backup": "", "diff": [{"before": "", "after": "", "before_header": "/etc/iproute2/rt_tables.d/table.conf (content)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (content)"}, {"before_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)", "after_header": "/etc/iproute2/rt_tables.d/table.conf (file attributes)"}], "invocation": {"module_args": {"path": "/etc/iproute2/rt_tables.d/table.conf", "line": "200 custom", "mode": "0644", "create": true, "state": "present", "backrefs": false, "backup": false, "firstmatch": false, "unsafe_writes": false, "regexp": null, "search_string": null, "insertafter": null, "insertbefore": null, "validate": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882583.25226: done with _execute_module (lineinfile, {'path': '/etc/iproute2/rt_tables.d/table.conf', 'line': '200 custom', 'mode': '0644', 'create': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'lineinfile', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882583.25236: _low_level_execute_command(): starting 29946 1726882583.25239: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882582.8386168-30428-150970947525629/ > /dev/null 2>&1 && sleep 0' 29946 1726882583.25667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.25670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882583.25707: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882583.25710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.25712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.25714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882583.25716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.25771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.25776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.25779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.25843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.27654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.27680: stderr chunk (state=3): >>><<< 29946 1726882583.27683: stdout chunk (state=3): >>><<< 29946 1726882583.27701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882583.27704: handler run complete 29946 1726882583.27721: attempt loop complete, returning result 29946 1726882583.27724: _execute() done 29946 1726882583.27727: dumping result to json 29946 1726882583.27732: done dumping result, returning 29946 1726882583.27739: done running TaskExecutor() for managed_node2/TASK: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table [12673a56-9f93-95e7-9dfb-00000000000f] 29946 1726882583.27742: sending task result for task 12673a56-9f93-95e7-9dfb-00000000000f 29946 1726882583.27844: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000000f 29946 1726882583.27847: WORKER PROCESS EXITING changed: [managed_node2] => { "backup": "", "changed": true } MSG: line added 29946 1726882583.27937: no more pending results, returning what we have 29946 1726882583.27941: results queue empty 29946 1726882583.27942: checking for any_errors_fatal 29946 1726882583.27947: done checking for any_errors_fatal 29946 1726882583.27948: checking for max_fail_percentage 29946 1726882583.27949: done checking for max_fail_percentage 29946 1726882583.27950: checking to see if all hosts have failed and the running result is not ok 29946 1726882583.27951: done checking to see if all hosts have failed 29946 1726882583.27951: getting the remaining hosts for this loop 29946 1726882583.27953: done getting the remaining hosts for this loop 29946 1726882583.27956: getting the next task for host managed_node2 29946 1726882583.27961: done getting next task for host managed_node2 29946 1726882583.27966: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29946 1726882583.27969: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882583.27985: getting variables 29946 1726882583.27986: in VariableManager get_vars() 29946 1726882583.28024: Calling all_inventory to load vars for managed_node2 29946 1726882583.28026: Calling groups_inventory to load vars for managed_node2 29946 1726882583.28028: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882583.28037: Calling all_plugins_play to load vars for managed_node2 29946 1726882583.28040: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882583.28042: Calling groups_plugins_play to load vars for managed_node2 29946 1726882583.28189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882583.28352: done with get_vars() 29946 1726882583.28360: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:23 -0400 (0:00:00.497) 0:00:09.393 ****** 29946 1726882583.28432: entering _queue_task() for managed_node2/include_tasks 29946 1726882583.28622: worker is 1 (out of 1 available) 29946 1726882583.28634: exiting _queue_task() for managed_node2/include_tasks 29946 1726882583.28644: done queuing things up, now waiting for results queue to drain 29946 1726882583.28646: waiting for pending results... 29946 1726882583.28809: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29946 1726882583.28886: in run() - task 12673a56-9f93-95e7-9dfb-000000000017 29946 1726882583.28902: variable 'ansible_search_path' from source: unknown 29946 1726882583.28907: variable 'ansible_search_path' from source: unknown 29946 1726882583.28931: calling self._execute() 29946 1726882583.28991: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.29000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.29008: variable 'omit' from source: magic vars 29946 1726882583.29258: variable 'ansible_distribution_major_version' from source: facts 29946 1726882583.29267: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882583.29273: _execute() done 29946 1726882583.29276: dumping result to json 29946 1726882583.29278: done dumping result, returning 29946 1726882583.29285: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-95e7-9dfb-000000000017] 29946 1726882583.29292: sending task result for task 12673a56-9f93-95e7-9dfb-000000000017 29946 1726882583.29374: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000017 29946 1726882583.29377: WORKER PROCESS EXITING 29946 1726882583.29444: no more pending results, returning what we have 29946 1726882583.29448: in VariableManager get_vars() 29946 1726882583.29481: Calling all_inventory to load vars for managed_node2 29946 1726882583.29484: Calling groups_inventory to load vars for managed_node2 29946 1726882583.29486: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882583.29495: Calling all_plugins_play to load vars for managed_node2 29946 1726882583.29498: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882583.29501: Calling groups_plugins_play to load vars for managed_node2 29946 1726882583.29616: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882583.29739: done with get_vars() 29946 1726882583.29745: variable 'ansible_search_path' from source: unknown 29946 1726882583.29745: variable 'ansible_search_path' from source: unknown 29946 1726882583.29770: we have included files to process 29946 1726882583.29771: generating all_blocks data 29946 1726882583.29772: done generating all_blocks data 29946 1726882583.29776: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882583.29777: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882583.29778: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882583.30244: done processing included file 29946 1726882583.30246: iterating over new_blocks loaded from include file 29946 1726882583.30247: in VariableManager get_vars() 29946 1726882583.30262: done with get_vars() 29946 1726882583.30263: filtering new block on tags 29946 1726882583.30273: done filtering new block on tags 29946 1726882583.30275: in VariableManager get_vars() 29946 1726882583.30288: done with get_vars() 29946 1726882583.30289: filtering new block on tags 29946 1726882583.30302: done filtering new block on tags 29946 1726882583.30304: in VariableManager get_vars() 29946 1726882583.30316: done with get_vars() 29946 1726882583.30317: filtering new block on tags 29946 1726882583.30327: done filtering new block on tags 29946 1726882583.30328: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 29946 1726882583.30331: extending task lists for all hosts with included blocks 29946 1726882583.30761: done extending task lists 29946 1726882583.30762: done processing included files 29946 1726882583.30762: results queue empty 29946 1726882583.30763: checking for any_errors_fatal 29946 1726882583.30766: done checking for any_errors_fatal 29946 1726882583.30767: checking for max_fail_percentage 29946 1726882583.30767: done checking for max_fail_percentage 29946 1726882583.30768: checking to see if all hosts have failed and the running result is not ok 29946 1726882583.30768: done checking to see if all hosts have failed 29946 1726882583.30768: getting the remaining hosts for this loop 29946 1726882583.30769: done getting the remaining hosts for this loop 29946 1726882583.30771: getting the next task for host managed_node2 29946 1726882583.30773: done getting next task for host managed_node2 29946 1726882583.30775: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29946 1726882583.30777: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882583.30782: getting variables 29946 1726882583.30783: in VariableManager get_vars() 29946 1726882583.30792: Calling all_inventory to load vars for managed_node2 29946 1726882583.30796: Calling groups_inventory to load vars for managed_node2 29946 1726882583.30797: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882583.30800: Calling all_plugins_play to load vars for managed_node2 29946 1726882583.30802: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882583.30805: Calling groups_plugins_play to load vars for managed_node2 29946 1726882583.30889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882583.31029: done with get_vars() 29946 1726882583.31035: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:23 -0400 (0:00:00.026) 0:00:09.420 ****** 29946 1726882583.31081: entering _queue_task() for managed_node2/setup 29946 1726882583.31257: worker is 1 (out of 1 available) 29946 1726882583.31270: exiting _queue_task() for managed_node2/setup 29946 1726882583.31282: done queuing things up, now waiting for results queue to drain 29946 1726882583.31284: waiting for pending results... 29946 1726882583.31432: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29946 1726882583.31522: in run() - task 12673a56-9f93-95e7-9dfb-0000000003d0 29946 1726882583.31533: variable 'ansible_search_path' from source: unknown 29946 1726882583.31536: variable 'ansible_search_path' from source: unknown 29946 1726882583.31560: calling self._execute() 29946 1726882583.31625: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.31629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.31632: variable 'omit' from source: magic vars 29946 1726882583.31874: variable 'ansible_distribution_major_version' from source: facts 29946 1726882583.31883: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882583.32026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882583.33616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882583.33656: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882583.33696: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882583.33722: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882583.33742: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882583.33798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 
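The lineinfile result reported above ("changed": true, "msg": "line added") records the complete set of module arguments that were shipped to managed_node2. A task that would produce that same invocation, reconstructed from the logged module_args (a sketch for reference, not the actual source of the test playbook), looks roughly like:

  - name: Create a dedicated test file in `/etc/iproute2/rt_tables.d/` and add a new routing table
    ansible.builtin.lineinfile:
      path: /etc/iproute2/rt_tables.d/table.conf
      line: "200 custom"   # registers routing table 200 under the name "custom"
      mode: "0644"
      create: true         # the file does not exist yet, so lineinfile creates it
      state: present

All remaining parameters (regexp, insertafter, backup, and so on) were left at their defaults, which matches the null values in the logged invocation.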
29946 1726882583.33824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882583.33841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882583.33866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882583.33877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882583.33922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882583.33938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882583.33955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882583.33979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882583.33991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882583.34088: variable '__network_required_facts' from source: role '' defaults 29946 1726882583.34092: variable 'ansible_facts' from source: unknown 29946 1726882583.34153: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 29946 1726882583.34157: when evaluation is False, skipping this task 29946 1726882583.34160: _execute() done 29946 1726882583.34162: dumping result to json 29946 1726882583.34164: done dumping result, returning 29946 1726882583.34168: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-95e7-9dfb-0000000003d0] 29946 1726882583.34173: sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d0 29946 1726882583.34245: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d0 29946 1726882583.34248: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882583.34286: no more pending results, returning what we have 29946 1726882583.34289: results queue empty 29946 1726882583.34290: checking for any_errors_fatal 29946 1726882583.34291: done checking for any_errors_fatal 29946 1726882583.34292: checking for max_fail_percentage 29946 1726882583.34295: done checking for max_fail_percentage 29946 1726882583.34296: checking to see if all hosts have failed and the running 
result is not ok 29946 1726882583.34297: done checking to see if all hosts have failed 29946 1726882583.34297: getting the remaining hosts for this loop 29946 1726882583.34299: done getting the remaining hosts for this loop 29946 1726882583.34302: getting the next task for host managed_node2 29946 1726882583.34310: done getting next task for host managed_node2 29946 1726882583.34313: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 29946 1726882583.34317: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882583.34329: getting variables 29946 1726882583.34330: in VariableManager get_vars() 29946 1726882583.34362: Calling all_inventory to load vars for managed_node2 29946 1726882583.34364: Calling groups_inventory to load vars for managed_node2 29946 1726882583.34366: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882583.34374: Calling all_plugins_play to load vars for managed_node2 29946 1726882583.34377: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882583.34379: Calling groups_plugins_play to load vars for managed_node2 29946 1726882583.34515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882583.34642: done with get_vars() 29946 1726882583.34649: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:23 -0400 (0:00:00.036) 0:00:09.456 ****** 29946 1726882583.34714: entering _queue_task() for managed_node2/stat 29946 1726882583.34890: worker is 1 (out of 1 available) 29946 1726882583.34906: exiting _queue_task() for managed_node2/stat 29946 1726882583.34916: done queuing things up, now waiting for results queue to drain 29946 1726882583.34918: waiting for pending results... 
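The skip recorded above comes from the role's guard that only re-runs fact gathering when one of its required facts is missing from the host. A minimal sketch of that pattern, assuming a plain setup task behind the conditional shown in the log (the conditional, the task name and the no_log behaviour are taken from the log; gather_subset is an assumption for illustration):

  - name: Ensure ansible_facts used by role are present
    ansible.builtin.setup:
      gather_subset: min   # assumption: gather only a minimal fact subset
    when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
    no_log: true           # matches the "output has been hidden" message in the skip result

Because the play already gathered facts, the difference is empty, the conditional evaluates to False and the task is skipped, exactly as logged.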
29946 1726882583.35063: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 29946 1726882583.35157: in run() - task 12673a56-9f93-95e7-9dfb-0000000003d2 29946 1726882583.35172: variable 'ansible_search_path' from source: unknown 29946 1726882583.35175: variable 'ansible_search_path' from source: unknown 29946 1726882583.35204: calling self._execute() 29946 1726882583.35260: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.35268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.35278: variable 'omit' from source: magic vars 29946 1726882583.35526: variable 'ansible_distribution_major_version' from source: facts 29946 1726882583.35536: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882583.35644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882583.36078: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882583.36109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882583.36136: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882583.36162: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882583.36221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882583.36240: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882583.36261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882583.36278: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882583.36335: variable '__network_is_ostree' from source: set_fact 29946 1726882583.36340: Evaluated conditional (not __network_is_ostree is defined): False 29946 1726882583.36343: when evaluation is False, skipping this task 29946 1726882583.36346: _execute() done 29946 1726882583.36348: dumping result to json 29946 1726882583.36350: done dumping result, returning 29946 1726882583.36363: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-95e7-9dfb-0000000003d2] 29946 1726882583.36366: sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d2 29946 1726882583.36440: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d2 29946 1726882583.36443: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29946 1726882583.36511: no more pending results, returning what we have 29946 1726882583.36514: results queue empty 29946 1726882583.36515: checking for any_errors_fatal 29946 1726882583.36519: done checking for any_errors_fatal 29946 1726882583.36520: checking for 
max_fail_percentage 29946 1726882583.36521: done checking for max_fail_percentage 29946 1726882583.36522: checking to see if all hosts have failed and the running result is not ok 29946 1726882583.36523: done checking to see if all hosts have failed 29946 1726882583.36524: getting the remaining hosts for this loop 29946 1726882583.36525: done getting the remaining hosts for this loop 29946 1726882583.36528: getting the next task for host managed_node2 29946 1726882583.36533: done getting next task for host managed_node2 29946 1726882583.36536: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29946 1726882583.36540: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882583.36551: getting variables 29946 1726882583.36552: in VariableManager get_vars() 29946 1726882583.36589: Calling all_inventory to load vars for managed_node2 29946 1726882583.36592: Calling groups_inventory to load vars for managed_node2 29946 1726882583.36595: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882583.36602: Calling all_plugins_play to load vars for managed_node2 29946 1726882583.36604: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882583.36606: Calling groups_plugins_play to load vars for managed_node2 29946 1726882583.36902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882583.37025: done with get_vars() 29946 1726882583.37031: done getting variables 29946 1726882583.37066: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:23 -0400 (0:00:00.023) 0:00:09.480 ****** 29946 1726882583.37090: entering _queue_task() for managed_node2/set_fact 29946 1726882583.37271: worker is 1 (out of 1 available) 29946 1726882583.37285: exiting _queue_task() for managed_node2/set_fact 29946 1726882583.37300: done queuing things up, now waiting for results queue to drain 29946 1726882583.37302: waiting for pending results... 
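Both ostree-related tasks are skipped here because __network_is_ostree is already set as a fact on managed_node2, so the guard "not __network_is_ostree is defined" evaluates to False. The usual shape of this stat-plus-set_fact pair, sketched under the assumption that the role probes an ostree marker file (the marker path and the register name below are illustrative assumptions; the task names and the when condition come from the log):

  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted        # assumed marker file, not shown in this log
    register: __ostree_booted_stat    # hypothetical register name
    when: not __network_is_ostree is defined

  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined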
29946 1726882583.37449: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29946 1726882583.37544: in run() - task 12673a56-9f93-95e7-9dfb-0000000003d3 29946 1726882583.37555: variable 'ansible_search_path' from source: unknown 29946 1726882583.37560: variable 'ansible_search_path' from source: unknown 29946 1726882583.37588: calling self._execute() 29946 1726882583.37644: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.37650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.37659: variable 'omit' from source: magic vars 29946 1726882583.37909: variable 'ansible_distribution_major_version' from source: facts 29946 1726882583.37918: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882583.38025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882583.38204: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882583.38236: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882583.38260: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882583.38290: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882583.38345: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882583.38362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882583.38379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882583.38400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882583.38461: variable '__network_is_ostree' from source: set_fact 29946 1726882583.38466: Evaluated conditional (not __network_is_ostree is defined): False 29946 1726882583.38469: when evaluation is False, skipping this task 29946 1726882583.38471: _execute() done 29946 1726882583.38474: dumping result to json 29946 1726882583.38476: done dumping result, returning 29946 1726882583.38484: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-95e7-9dfb-0000000003d3] 29946 1726882583.38490: sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d3 29946 1726882583.38565: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d3 29946 1726882583.38568: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29946 1726882583.38639: no more pending results, returning what we have 29946 1726882583.38641: results queue empty 29946 1726882583.38642: checking for any_errors_fatal 29946 1726882583.38646: done checking for any_errors_fatal 29946 
1726882583.38647: checking for max_fail_percentage 29946 1726882583.38649: done checking for max_fail_percentage 29946 1726882583.38649: checking to see if all hosts have failed and the running result is not ok 29946 1726882583.38650: done checking to see if all hosts have failed 29946 1726882583.38651: getting the remaining hosts for this loop 29946 1726882583.38652: done getting the remaining hosts for this loop 29946 1726882583.38655: getting the next task for host managed_node2 29946 1726882583.38661: done getting next task for host managed_node2 29946 1726882583.38664: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 29946 1726882583.38667: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882583.38680: getting variables 29946 1726882583.38681: in VariableManager get_vars() 29946 1726882583.38709: Calling all_inventory to load vars for managed_node2 29946 1726882583.38711: Calling groups_inventory to load vars for managed_node2 29946 1726882583.38712: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882583.38718: Calling all_plugins_play to load vars for managed_node2 29946 1726882583.38719: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882583.38721: Calling groups_plugins_play to load vars for managed_node2 29946 1726882583.38833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882583.38967: done with get_vars() 29946 1726882583.38974: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:23 -0400 (0:00:00.019) 0:00:09.499 ****** 29946 1726882583.39038: entering _queue_task() for managed_node2/service_facts 29946 1726882583.39039: Creating lock for service_facts 29946 1726882583.39218: worker is 1 (out of 1 available) 29946 1726882583.39230: exiting _queue_task() for managed_node2/service_facts 29946 1726882583.39240: done queuing things up, now waiting for results queue to drain 29946 1726882583.39242: waiting for pending results... 
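service_facts takes no options; it populates ansible_facts.services with a map of unit names to their state, which the role can then inspect to decide how to drive the network provider. A minimal sketch of the task and of one way the result could be consumed (the follow-up debug task is purely illustrative and not part of the role):

  - name: Check which services are running
    ansible.builtin.service_facts:    # no arguments; results land in ansible_facts.services

  - name: Report NetworkManager state (illustrative only)
    ansible.builtin.debug:
      msg: "{{ (ansible_facts.services['NetworkManager.service'] | default({})).state | default('not installed') }}"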
29946 1726882583.39488: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 29946 1726882583.39496: in run() - task 12673a56-9f93-95e7-9dfb-0000000003d5 29946 1726882583.39499: variable 'ansible_search_path' from source: unknown 29946 1726882583.39502: variable 'ansible_search_path' from source: unknown 29946 1726882583.39505: calling self._execute() 29946 1726882583.39551: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.39556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.39564: variable 'omit' from source: magic vars 29946 1726882583.39859: variable 'ansible_distribution_major_version' from source: facts 29946 1726882583.39868: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882583.39873: variable 'omit' from source: magic vars 29946 1726882583.39923: variable 'omit' from source: magic vars 29946 1726882583.39945: variable 'omit' from source: magic vars 29946 1726882583.39973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882583.39999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882583.40019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882583.40030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882583.40039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882583.40060: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882583.40064: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.40066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.40137: Set connection var ansible_pipelining to False 29946 1726882583.40140: Set connection var ansible_shell_executable to /bin/sh 29946 1726882583.40146: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882583.40151: Set connection var ansible_timeout to 10 29946 1726882583.40157: Set connection var ansible_shell_type to sh 29946 1726882583.40160: Set connection var ansible_connection to ssh 29946 1726882583.40175: variable 'ansible_shell_executable' from source: unknown 29946 1726882583.40178: variable 'ansible_connection' from source: unknown 29946 1726882583.40180: variable 'ansible_module_compression' from source: unknown 29946 1726882583.40183: variable 'ansible_shell_type' from source: unknown 29946 1726882583.40188: variable 'ansible_shell_executable' from source: unknown 29946 1726882583.40191: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882583.40195: variable 'ansible_pipelining' from source: unknown 29946 1726882583.40198: variable 'ansible_timeout' from source: unknown 29946 1726882583.40200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882583.40325: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882583.40332: variable 'omit' from source: magic vars 29946 
1726882583.40344: starting attempt loop 29946 1726882583.40347: running the handler 29946 1726882583.40352: _low_level_execute_command(): starting 29946 1726882583.40359: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882583.40854: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.40857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.40860: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.40862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.40919: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.40922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.40924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.41000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.42650: stdout chunk (state=3): >>>/root <<< 29946 1726882583.42802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.42805: stdout chunk (state=3): >>><<< 29946 1726882583.42810: stderr chunk (state=3): >>><<< 29946 1726882583.42838: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882583.42858: _low_level_execute_command(): starting 29946 1726882583.42934: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381 `" && echo ansible-tmp-1726882583.4284496-30452-76319966122381="` echo /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381 `" ) && sleep 0' 29946 1726882583.43485: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882583.43508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.43619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.43645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.43669: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.43769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.45627: stdout chunk (state=3): >>>ansible-tmp-1726882583.4284496-30452-76319966122381=/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381 <<< 29946 1726882583.45735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.45757: stderr chunk (state=3): >>><<< 29946 1726882583.45760: stdout chunk (state=3): >>><<< 29946 1726882583.45773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882583.4284496-30452-76319966122381=/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882583.45810: variable 'ansible_module_compression' from source: unknown 29946 1726882583.45843: 
ANSIBALLZ: Using lock for service_facts 29946 1726882583.45846: ANSIBALLZ: Acquiring lock 29946 1726882583.45848: ANSIBALLZ: Lock acquired: 140626577265856 29946 1726882583.45854: ANSIBALLZ: Creating module 29946 1726882583.53873: ANSIBALLZ: Writing module into payload 29946 1726882583.53938: ANSIBALLZ: Writing module 29946 1726882583.53955: ANSIBALLZ: Renaming module 29946 1726882583.53961: ANSIBALLZ: Done creating module 29946 1726882583.53976: variable 'ansible_facts' from source: unknown 29946 1726882583.54027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py 29946 1726882583.54132: Sending initial data 29946 1726882583.54135: Sent initial data (161 bytes) 29946 1726882583.54588: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.54591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882583.54595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.54598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.54600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882583.54602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.54655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.54662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.54665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.54723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.56307: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 29946 1726882583.56310: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882583.56362: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882583.56429: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpa84w2ws4 /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py <<< 29946 1726882583.56435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py" <<< 29946 1726882583.56492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpa84w2ws4" to remote "/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py" <<< 29946 1726882583.57198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.57327: stderr chunk (state=3): >>><<< 29946 1726882583.57331: stdout chunk (state=3): >>><<< 29946 1726882583.57345: done transferring module to remote 29946 1726882583.57359: _low_level_execute_command(): starting 29946 1726882583.57369: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/ /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py && sleep 0' 29946 1726882583.57966: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882583.57979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.58009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.58109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.58135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882583.58150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.58244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882583.59953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882583.59972: stderr chunk (state=3): >>><<< 29946 1726882583.59975: stdout chunk (state=3): >>><<< 29946 1726882583.59991: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882583.60024: _low_level_execute_command(): starting 29946 1726882583.60027: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/AnsiballZ_service_facts.py && sleep 0' 29946 1726882583.60560: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882583.60563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882583.60565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882583.60625: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882583.60628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882583.60703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.13368: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 29946 1726882585.13385: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 29946 1726882585.13424: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 29946 1726882585.13445: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": 
"pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": <<< 29946 1726882585.13451: stdout chunk (state=3): >>>"static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": 
"systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 29946 1726882585.14882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882585.14915: stderr chunk (state=3): >>><<< 29946 1726882585.14919: stdout chunk (state=3): >>><<< 29946 1726882585.14938: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": 
"systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882585.15326: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882585.15335: _low_level_execute_command(): starting 29946 1726882585.15338: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882583.4284496-30452-76319966122381/ > /dev/null 2>&1 && sleep 0' 29946 1726882585.15776: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882585.15779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882585.15781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.15783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882585.15785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882585.15787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.15836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882585.15839: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882585.15912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.17732: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882585.17758: stderr chunk (state=3): >>><<< 29946 1726882585.17761: stdout chunk (state=3): >>><<< 29946 1726882585.17771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882585.17776: handler run complete 29946 1726882585.17888: variable 'ansible_facts' from source: unknown 29946 1726882585.17985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882585.18249: variable 'ansible_facts' from source: unknown 29946 1726882585.18335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882585.18449: attempt loop complete, returning result 29946 1726882585.18453: _execute() done 29946 1726882585.18457: dumping result to json 29946 1726882585.18497: done dumping result, returning 29946 1726882585.18505: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-95e7-9dfb-0000000003d5] 29946 1726882585.18507: sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d5 ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882585.19085: no more pending results, returning what we have 29946 1726882585.19088: results queue empty 29946 1726882585.19088: checking for any_errors_fatal 29946 1726882585.19095: done checking for any_errors_fatal 29946 1726882585.19095: checking for max_fail_percentage 29946 1726882585.19097: done checking for max_fail_percentage 29946 1726882585.19097: checking to see if all hosts have failed and the running result is not ok 29946 1726882585.19098: done checking to see if all hosts have failed 29946 1726882585.19099: getting the remaining hosts for this loop 29946 1726882585.19100: done getting the remaining hosts for this loop 29946 1726882585.19104: getting the next task for host managed_node2 29946 1726882585.19110: done getting next task for host managed_node2 29946 1726882585.19113: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 29946 1726882585.19116: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882585.19125: getting variables 29946 1726882585.19126: in VariableManager get_vars() 29946 1726882585.19155: Calling all_inventory to load vars for managed_node2 29946 1726882585.19157: Calling groups_inventory to load vars for managed_node2 29946 1726882585.19159: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882585.19167: Calling all_plugins_play to load vars for managed_node2 29946 1726882585.19169: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882585.19171: Calling groups_plugins_play to load vars for managed_node2 29946 1726882585.19440: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d5 29946 1726882585.19444: WORKER PROCESS EXITING 29946 1726882585.19453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882585.19741: done with get_vars() 29946 1726882585.19750: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:25 -0400 (0:00:01.807) 0:00:11.307 ****** 29946 1726882585.19820: entering _queue_task() for managed_node2/package_facts 29946 1726882585.19821: Creating lock for package_facts 29946 1726882585.20023: worker is 1 (out of 1 available) 29946 1726882585.20036: exiting _queue_task() for managed_node2/package_facts 29946 1726882585.20048: done queuing things up, now waiting for results queue to drain 29946 1726882585.20049: waiting for pending results... 
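The task banner above shows the role running its "Check which packages are installed" step from roles/network/tasks/set_facts.yml using the package_facts module (the module name also appears below in the ANSIBALLZ and AnsiballZ_package_facts.py lines). A minimal illustrative sketch of such a task, plus a hypothetical follow-up task showing how the gathered facts are typically consumed, is given here; the exact task body and the debug step are assumptions for illustration, not the role's actual YAML:

    # Sketch only: gather installed-package facts on the managed host
    - name: Check which packages are installed
      ansible.builtin.package_facts:

    # Hypothetical follow-up task: ansible_facts.packages is a dict keyed by
    # package name, each value a list of installed versions
    - name: Report whether NetworkManager is installed (illustrative)
      ansible.builtin.debug:
        msg: "NetworkManager is installed"
      when: "'NetworkManager' in ansible_facts.packages"

The {"ansible_facts": {"packages": {...}}} structure assumed by the when-clause matches the module output captured further down in this log.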
29946 1726882585.20216: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 29946 1726882585.20309: in run() - task 12673a56-9f93-95e7-9dfb-0000000003d6 29946 1726882585.20320: variable 'ansible_search_path' from source: unknown 29946 1726882585.20323: variable 'ansible_search_path' from source: unknown 29946 1726882585.20350: calling self._execute() 29946 1726882585.20416: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882585.20422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882585.20430: variable 'omit' from source: magic vars 29946 1726882585.20689: variable 'ansible_distribution_major_version' from source: facts 29946 1726882585.20704: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882585.20707: variable 'omit' from source: magic vars 29946 1726882585.20756: variable 'omit' from source: magic vars 29946 1726882585.20779: variable 'omit' from source: magic vars 29946 1726882585.20815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882585.20843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882585.20858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882585.20871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882585.20880: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882585.20907: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882585.20911: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882585.20914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882585.20981: Set connection var ansible_pipelining to False 29946 1726882585.20985: Set connection var ansible_shell_executable to /bin/sh 29946 1726882585.20995: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882585.21000: Set connection var ansible_timeout to 10 29946 1726882585.21007: Set connection var ansible_shell_type to sh 29946 1726882585.21012: Set connection var ansible_connection to ssh 29946 1726882585.21029: variable 'ansible_shell_executable' from source: unknown 29946 1726882585.21032: variable 'ansible_connection' from source: unknown 29946 1726882585.21036: variable 'ansible_module_compression' from source: unknown 29946 1726882585.21039: variable 'ansible_shell_type' from source: unknown 29946 1726882585.21041: variable 'ansible_shell_executable' from source: unknown 29946 1726882585.21044: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882585.21046: variable 'ansible_pipelining' from source: unknown 29946 1726882585.21048: variable 'ansible_timeout' from source: unknown 29946 1726882585.21050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882585.21187: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882585.21198: variable 'omit' from source: magic vars 29946 
1726882585.21203: starting attempt loop 29946 1726882585.21205: running the handler 29946 1726882585.21218: _low_level_execute_command(): starting 29946 1726882585.21224: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882585.21720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.21724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882585.21727: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.21779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882585.21787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882585.21791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882585.21851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.23457: stdout chunk (state=3): >>>/root <<< 29946 1726882585.23555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882585.23583: stderr chunk (state=3): >>><<< 29946 1726882585.23586: stdout chunk (state=3): >>><<< 29946 1726882585.23608: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882585.23620: _low_level_execute_command(): starting 29946 1726882585.23625: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846 
`" && echo ansible-tmp-1726882585.236084-30511-105089958938846="` echo /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846 `" ) && sleep 0' 29946 1726882585.24052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882585.24055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882585.24057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.24066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.24068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.24115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882585.24118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882585.24181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.26036: stdout chunk (state=3): >>>ansible-tmp-1726882585.236084-30511-105089958938846=/root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846 <<< 29946 1726882585.26143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882585.26168: stderr chunk (state=3): >>><<< 29946 1726882585.26171: stdout chunk (state=3): >>><<< 29946 1726882585.26184: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882585.236084-30511-105089958938846=/root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882585.26222: variable 'ansible_module_compression' from source: 
unknown 29946 1726882585.26256: ANSIBALLZ: Using lock for package_facts 29946 1726882585.26260: ANSIBALLZ: Acquiring lock 29946 1726882585.26262: ANSIBALLZ: Lock acquired: 140626579498544 29946 1726882585.26268: ANSIBALLZ: Creating module 29946 1726882585.43646: ANSIBALLZ: Writing module into payload 29946 1726882585.43738: ANSIBALLZ: Writing module 29946 1726882585.43758: ANSIBALLZ: Renaming module 29946 1726882585.43764: ANSIBALLZ: Done creating module 29946 1726882585.43795: variable 'ansible_facts' from source: unknown 29946 1726882585.43907: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py 29946 1726882585.44013: Sending initial data 29946 1726882585.44017: Sent initial data (161 bytes) 29946 1726882585.44468: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.44471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882585.44476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.44478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.44480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.44535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882585.44538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882585.44540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882585.44615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.46235: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882585.46239: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882585.46291: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882585.46352: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp9flcel3p /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py <<< 29946 1726882585.46358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py" <<< 29946 1726882585.46418: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp9flcel3p" to remote "/root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py" <<< 29946 1726882585.47549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882585.47591: stderr chunk (state=3): >>><<< 29946 1726882585.47596: stdout chunk (state=3): >>><<< 29946 1726882585.47638: done transferring module to remote 29946 1726882585.47647: _low_level_execute_command(): starting 29946 1726882585.47652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/ /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py && sleep 0' 29946 1726882585.48092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882585.48097: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.48103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882585.48105: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.48107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.48153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882585.48159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882585.48222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.50025: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882585.50029: stdout chunk (state=3): >>><<< 29946 1726882585.50031: stderr chunk (state=3): >>><<< 29946 1726882585.50048: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882585.50160: _low_level_execute_command(): starting 29946 1726882585.50164: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/AnsiballZ_package_facts.py && sleep 0' 29946 1726882585.50729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.50732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.50734: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882585.50736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882585.50789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882585.50809: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882585.50832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882585.50936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882585.94682: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": 
[{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 
2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 29946 1726882585.94779: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", 
"release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": 
"2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": 
"6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": 
"libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 29946 1726882585.94797: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": 
"kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", 
"version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-resc<<< 29946 1726882585.94866: stdout chunk (state=3): >>>ue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", 
"version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": 
[{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 29946 1726882585.96645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882585.96660: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882585.96728: stderr chunk (state=3): >>><<< 29946 1726882585.96763: stdout chunk (state=3): >>><<< 29946 1726882585.96785: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882585.99813: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882585.99818: _low_level_execute_command(): starting 29946 1726882585.99820: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882585.236084-30511-105089958938846/ > /dev/null 2>&1 && sleep 0' 29946 1726882586.00530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882586.00534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882586.00536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882586.00631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882586.00685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882586.02534: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882586.02564: stderr chunk (state=3): >>><<< 29946 1726882586.02566: stdout chunk (state=3): >>><<< 29946 1726882586.02575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882586.02652: handler run complete 29946 1726882586.03046: variable 'ansible_facts' from source: unknown 29946 1726882586.03352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.07778: variable 'ansible_facts' from source: unknown 29946 1726882586.08200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.08820: attempt loop complete, returning result 29946 1726882586.08865: _execute() done 29946 1726882586.08869: dumping result to json 29946 1726882586.09027: done dumping result, returning 29946 1726882586.09034: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-95e7-9dfb-0000000003d6] 29946 1726882586.09037: sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d6 29946 1726882586.11483: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000003d6 29946 1726882586.11490: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882586.11581: no more pending results, returning what we have 29946 1726882586.11594: results queue empty 29946 1726882586.11596: checking for any_errors_fatal 29946 1726882586.11602: done checking for any_errors_fatal 29946 1726882586.11603: checking for max_fail_percentage 29946 1726882586.11605: done checking for max_fail_percentage 29946 1726882586.11606: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.11606: done checking to see if all hosts have failed 29946 1726882586.11607: getting the remaining hosts for this loop 29946 1726882586.11608: done getting the remaining hosts for this loop 29946 1726882586.11612: getting the next task for host managed_node2 29946 1726882586.11619: done getting next task for host managed_node2 29946 1726882586.11622: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 29946 1726882586.11625: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882586.11635: getting variables 29946 1726882586.11636: in VariableManager get_vars() 29946 1726882586.11664: Calling all_inventory to load vars for managed_node2 29946 1726882586.11666: Calling groups_inventory to load vars for managed_node2 29946 1726882586.11668: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.11676: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.11678: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.11681: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.13023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.14707: done with get_vars() 29946 1726882586.14729: done getting variables 29946 1726882586.14796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:26 -0400 (0:00:00.950) 0:00:12.257 ****** 29946 1726882586.14828: entering _queue_task() for managed_node2/debug 29946 1726882586.15153: worker is 1 (out of 1 available) 29946 1726882586.15164: exiting _queue_task() for managed_node2/debug 29946 1726882586.15295: done queuing things up, now waiting for results queue to drain 29946 1726882586.15297: waiting for pending results... 29946 1726882586.15524: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 29946 1726882586.15606: in run() - task 12673a56-9f93-95e7-9dfb-000000000018 29946 1726882586.15640: variable 'ansible_search_path' from source: unknown 29946 1726882586.15648: variable 'ansible_search_path' from source: unknown 29946 1726882586.15685: calling self._execute() 29946 1726882586.15798: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.15839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.15842: variable 'omit' from source: magic vars 29946 1726882586.16242: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.16261: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.16282: variable 'omit' from source: magic vars 29946 1726882586.16388: variable 'omit' from source: magic vars 29946 1726882586.16456: variable 'network_provider' from source: set_fact 29946 1726882586.16480: variable 'omit' from source: magic vars 29946 1726882586.16537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882586.16577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882586.16617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882586.16698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882586.16708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 
1726882586.16711: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882586.16713: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.16715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.16830: Set connection var ansible_pipelining to False 29946 1726882586.16844: Set connection var ansible_shell_executable to /bin/sh 29946 1726882586.16856: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882586.16865: Set connection var ansible_timeout to 10 29946 1726882586.16875: Set connection var ansible_shell_type to sh 29946 1726882586.16881: Set connection var ansible_connection to ssh 29946 1726882586.16910: variable 'ansible_shell_executable' from source: unknown 29946 1726882586.16928: variable 'ansible_connection' from source: unknown 29946 1726882586.16944: variable 'ansible_module_compression' from source: unknown 29946 1726882586.16947: variable 'ansible_shell_type' from source: unknown 29946 1726882586.16949: variable 'ansible_shell_executable' from source: unknown 29946 1726882586.17038: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.17042: variable 'ansible_pipelining' from source: unknown 29946 1726882586.17044: variable 'ansible_timeout' from source: unknown 29946 1726882586.17046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.17129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882586.17152: variable 'omit' from source: magic vars 29946 1726882586.17167: starting attempt loop 29946 1726882586.17173: running the handler 29946 1726882586.17225: handler run complete 29946 1726882586.17244: attempt loop complete, returning result 29946 1726882586.17260: _execute() done 29946 1726882586.17271: dumping result to json 29946 1726882586.17278: done dumping result, returning 29946 1726882586.17291: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-95e7-9dfb-000000000018] 29946 1726882586.17302: sending task result for task 12673a56-9f93-95e7-9dfb-000000000018 ok: [managed_node2] => {} MSG: Using network provider: nm 29946 1726882586.17568: no more pending results, returning what we have 29946 1726882586.17571: results queue empty 29946 1726882586.17573: checking for any_errors_fatal 29946 1726882586.17595: done checking for any_errors_fatal 29946 1726882586.17596: checking for max_fail_percentage 29946 1726882586.17598: done checking for max_fail_percentage 29946 1726882586.17599: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.17600: done checking to see if all hosts have failed 29946 1726882586.17601: getting the remaining hosts for this loop 29946 1726882586.17602: done getting the remaining hosts for this loop 29946 1726882586.17606: getting the next task for host managed_node2 29946 1726882586.17614: done getting next task for host managed_node2 29946 1726882586.17618: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29946 1726882586.17622: ^ state is: HOST STATE: block=2, 
task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882586.17635: getting variables 29946 1726882586.17637: in VariableManager get_vars() 29946 1726882586.17674: Calling all_inventory to load vars for managed_node2 29946 1726882586.17676: Calling groups_inventory to load vars for managed_node2 29946 1726882586.17679: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.17809: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.17813: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.17819: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000018 29946 1726882586.17822: WORKER PROCESS EXITING 29946 1726882586.17825: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.19307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.21168: done with get_vars() 29946 1726882586.21191: done getting variables 29946 1726882586.21255: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:26 -0400 (0:00:00.064) 0:00:12.322 ****** 29946 1726882586.21297: entering _queue_task() for managed_node2/fail 29946 1726882586.21807: worker is 1 (out of 1 available) 29946 1726882586.21817: exiting _queue_task() for managed_node2/fail 29946 1726882586.21827: done queuing things up, now waiting for results queue to drain 29946 1726882586.21828: waiting for pending results... 
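The two steps traced above can be reconstructed from the log itself: the role first gathers the installed-package inventory with package_facts (invoked with manager "auto" and strategy "first", hidden from normal output by no_log), then prints the chosen provider with a debug task gated on the distribution major version. A minimal sketch of equivalent tasks, inferred from the evaluated conditionals and the "Using network provider: nm" message rather than copied from the role's tasks/main.yml, looks like this:

    # Sketch only: reconstructed from the log, not the role's verbatim source.
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
        strategy: first
      no_log: true

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
      when: ansible_distribution_major_version != '6'

The package list returned above is exposed to later tasks as ansible_facts.packages, keyed by package name.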
29946 1726882586.21914: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29946 1726882586.22060: in run() - task 12673a56-9f93-95e7-9dfb-000000000019 29946 1726882586.22082: variable 'ansible_search_path' from source: unknown 29946 1726882586.22095: variable 'ansible_search_path' from source: unknown 29946 1726882586.22135: calling self._execute() 29946 1726882586.22237: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.22249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.22263: variable 'omit' from source: magic vars 29946 1726882586.22663: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.22680: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.22827: variable 'network_state' from source: role '' defaults 29946 1726882586.22898: Evaluated conditional (network_state != {}): False 29946 1726882586.22902: when evaluation is False, skipping this task 29946 1726882586.22904: _execute() done 29946 1726882586.22907: dumping result to json 29946 1726882586.22909: done dumping result, returning 29946 1726882586.22911: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-95e7-9dfb-000000000019] 29946 1726882586.22913: sending task result for task 12673a56-9f93-95e7-9dfb-000000000019 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882586.23080: no more pending results, returning what we have 29946 1726882586.23084: results queue empty 29946 1726882586.23088: checking for any_errors_fatal 29946 1726882586.23098: done checking for any_errors_fatal 29946 1726882586.23099: checking for max_fail_percentage 29946 1726882586.23101: done checking for max_fail_percentage 29946 1726882586.23102: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.23103: done checking to see if all hosts have failed 29946 1726882586.23104: getting the remaining hosts for this loop 29946 1726882586.23105: done getting the remaining hosts for this loop 29946 1726882586.23108: getting the next task for host managed_node2 29946 1726882586.23115: done getting next task for host managed_node2 29946 1726882586.23119: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29946 1726882586.23123: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882586.23153: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000019 29946 1726882586.23156: WORKER PROCESS EXITING 29946 1726882586.23264: getting variables 29946 1726882586.23267: in VariableManager get_vars() 29946 1726882586.23324: Calling all_inventory to load vars for managed_node2 29946 1726882586.23327: Calling groups_inventory to load vars for managed_node2 29946 1726882586.23331: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.23344: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.23351: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.23355: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.24970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.28309: done with get_vars() 29946 1726882586.28449: done getting variables 29946 1726882586.28594: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:26 -0400 (0:00:00.073) 0:00:12.395 ****** 29946 1726882586.28631: entering _queue_task() for managed_node2/fail 29946 1726882586.29353: worker is 1 (out of 1 available) 29946 1726882586.29366: exiting _queue_task() for managed_node2/fail 29946 1726882586.29377: done queuing things up, now waiting for results queue to drain 29946 1726882586.29379: waiting for pending results... 
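The skip recorded above follows directly from the reported false_condition: network_state comes from the role defaults and is an empty dict in this run, so the guard against applying network_state with the initscripts provider never fires. A hedged sketch of such a guard, with the provider comparison assumed from the task name (only the network_state test appears in the log), might be:

    # Sketch only: the provider comparison is an assumption, not taken from the log.
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider.
      when:
        - network_state != {}
        - network_provider == "initscripts"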
29946 1726882586.29856: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29946 1726882586.29982: in run() - task 12673a56-9f93-95e7-9dfb-00000000001a 29946 1726882586.29989: variable 'ansible_search_path' from source: unknown 29946 1726882586.29992: variable 'ansible_search_path' from source: unknown 29946 1726882586.30254: calling self._execute() 29946 1726882586.30301: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.30309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.30316: variable 'omit' from source: magic vars 29946 1726882586.31077: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.31157: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.31202: variable 'network_state' from source: role '' defaults 29946 1726882586.31213: Evaluated conditional (network_state != {}): False 29946 1726882586.31217: when evaluation is False, skipping this task 29946 1726882586.31219: _execute() done 29946 1726882586.31222: dumping result to json 29946 1726882586.31225: done dumping result, returning 29946 1726882586.31234: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-95e7-9dfb-00000000001a] 29946 1726882586.31239: sending task result for task 12673a56-9f93-95e7-9dfb-00000000001a 29946 1726882586.31421: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000001a 29946 1726882586.31424: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882586.31467: no more pending results, returning what we have 29946 1726882586.31470: results queue empty 29946 1726882586.31542: checking for any_errors_fatal 29946 1726882586.31551: done checking for any_errors_fatal 29946 1726882586.31552: checking for max_fail_percentage 29946 1726882586.31553: done checking for max_fail_percentage 29946 1726882586.31554: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.31555: done checking to see if all hosts have failed 29946 1726882586.31556: getting the remaining hosts for this loop 29946 1726882586.31557: done getting the remaining hosts for this loop 29946 1726882586.31560: getting the next task for host managed_node2 29946 1726882586.31566: done getting next task for host managed_node2 29946 1726882586.31570: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29946 1726882586.31573: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882586.31587: getting variables 29946 1726882586.31588: in VariableManager get_vars() 29946 1726882586.31623: Calling all_inventory to load vars for managed_node2 29946 1726882586.31626: Calling groups_inventory to load vars for managed_node2 29946 1726882586.31628: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.31636: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.31639: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.31642: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.33050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.34576: done with get_vars() 29946 1726882586.34602: done getting variables 29946 1726882586.34659: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:26 -0400 (0:00:00.060) 0:00:12.456 ****** 29946 1726882586.34692: entering _queue_task() for managed_node2/fail 29946 1726882586.35007: worker is 1 (out of 1 available) 29946 1726882586.35019: exiting _queue_task() for managed_node2/fail 29946 1726882586.35030: done queuing things up, now waiting for results queue to drain 29946 1726882586.35031: waiting for pending results... 
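As with the previous guard, this task is skipped because network_state is empty, so the distribution-version part of the check (implied by the task name) is never reached. A sketch under that assumption:

    # Sketch only: the version comparison is inferred from the task name.
    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state requires a managed host running EL8 or later.
      when:
        - network_state != {}
        - ansible_distribution_major_version | int < 8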
29946 1726882586.35416: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29946 1726882586.35698: in run() - task 12673a56-9f93-95e7-9dfb-00000000001b 29946 1726882586.35702: variable 'ansible_search_path' from source: unknown 29946 1726882586.35706: variable 'ansible_search_path' from source: unknown 29946 1726882586.35708: calling self._execute() 29946 1726882586.35711: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.35714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.35716: variable 'omit' from source: magic vars 29946 1726882586.36053: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.36057: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.36134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882586.38341: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882586.38423: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882586.38466: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882586.38499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882586.38524: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882586.38605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.38728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.38731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.38734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.38736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.38816: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.38835: Evaluated conditional (ansible_distribution_major_version | int > 9): True 29946 1726882586.38950: variable 'ansible_distribution' from source: facts 29946 1726882586.38953: variable '__network_rh_distros' from source: role '' defaults 29946 1726882586.38964: Evaluated conditional (ansible_distribution in __network_rh_distros): True 29946 1726882586.39229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.39252: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.39275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.39323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.39340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.39389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.39444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.39468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.39506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.39520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.39597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.39600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.39812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.39848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.39862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.40512: variable 'network_connections' from source: task vars 29946 1726882586.40515: variable 'interface' from source: set_fact 29946 1726882586.40704: variable 'interface' from source: set_fact 29946 1726882586.40707: variable 'interface' from source: set_fact 29946 1726882586.40756: variable 'interface' from source: set_fact 29946 1726882586.40817: variable 'network_state' from source: role '' defaults 29946 
1726882586.40869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882586.41100: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882586.41109: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882586.41117: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882586.41147: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882586.41191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882586.41216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882586.41239: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.41263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882586.41300: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 29946 1726882586.41304: when evaluation is False, skipping this task 29946 1726882586.41306: _execute() done 29946 1726882586.41308: dumping result to json 29946 1726882586.41316: done dumping result, returning 29946 1726882586.41325: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-95e7-9dfb-00000000001b] 29946 1726882586.41335: sending task result for task 12673a56-9f93-95e7-9dfb-00000000001b 29946 1726882586.41515: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000001b 29946 1726882586.41518: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 29946 1726882586.41567: no more pending results, returning what we have 29946 1726882586.41570: results queue empty 29946 1726882586.41571: checking for any_errors_fatal 29946 1726882586.41579: done checking for any_errors_fatal 29946 1726882586.41580: checking for max_fail_percentage 29946 1726882586.41581: done checking for max_fail_percentage 29946 1726882586.41582: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.41583: done checking to see if all hosts have failed 29946 1726882586.41584: getting the remaining hosts for this loop 29946 1726882586.41585: done getting the remaining hosts for this loop 29946 1726882586.41589: getting the next 
task for host managed_node2 29946 1726882586.41597: done getting next task for host managed_node2 29946 1726882586.41601: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29946 1726882586.41604: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882586.41619: getting variables 29946 1726882586.41621: in VariableManager get_vars() 29946 1726882586.41660: Calling all_inventory to load vars for managed_node2 29946 1726882586.41663: Calling groups_inventory to load vars for managed_node2 29946 1726882586.41666: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.41681: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.41684: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.41686: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.43240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.45316: done with get_vars() 29946 1726882586.45340: done getting variables 29946 1726882586.45432: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:26 -0400 (0:00:00.107) 0:00:12.564 ****** 29946 1726882586.45462: entering _queue_task() for managed_node2/dnf 29946 1726882586.45765: worker is 1 (out of 1 available) 29946 1726882586.45777: exiting _queue_task() for managed_node2/dnf 29946 1726882586.45788: done queuing things up, now waiting for results queue to drain 29946 1726882586.45790: waiting for pending results... 
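Unlike the two network_state guards, this teaming check is evaluated in full, and the log records every condition: the host is EL10 (major version > 9), the distribution is in __network_rh_distros, but neither network_connections nor network_state defines an interface of type "team", so the task is skipped. A sketch of a fail task using exactly the evaluated conditions (the failure message is assumed):

    # Sketch only: conditions copied from the log's "Evaluated conditional" lines;
    # the failure message is illustrative.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on this distribution version.
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - >-
          network_connections | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0
          or network_state.get("interfaces", []) | selectattr("type", "defined")
          | selectattr("type", "match", "^team$") | list | length > 0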
29946 1726882586.46113: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29946 1726882586.46179: in run() - task 12673a56-9f93-95e7-9dfb-00000000001c 29946 1726882586.46192: variable 'ansible_search_path' from source: unknown 29946 1726882586.46199: variable 'ansible_search_path' from source: unknown 29946 1726882586.46236: calling self._execute() 29946 1726882586.46318: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.46325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.46334: variable 'omit' from source: magic vars 29946 1726882586.46694: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.46704: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.46891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882586.49383: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882586.50114: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882586.50263: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882586.50298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882586.50323: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882586.50475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.50514: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.50546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.50609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.50629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.50796: variable 'ansible_distribution' from source: facts 29946 1726882586.50799: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.50802: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 29946 1726882586.50916: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882586.51059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.51090: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.51128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.51198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.51202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.51245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.51274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.51307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.51398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.51402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.51420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.51456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.51484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.51531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.51558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.51723: variable 'network_connections' from source: task vars 29946 1726882586.51770: variable 'interface' from source: set_fact 29946 1726882586.51820: variable 'interface' from source: set_fact 29946 1726882586.51834: variable 'interface' from source: set_fact 29946 1726882586.51906: variable 'interface' from source: set_fact 29946 1726882586.52197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 29946 1726882586.52242: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882586.52372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882586.52700: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882586.52704: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882586.52706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882586.52732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882586.52766: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.52832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882586.52982: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882586.53468: variable 'network_connections' from source: task vars 29946 1726882586.53481: variable 'interface' from source: set_fact 29946 1726882586.53545: variable 'interface' from source: set_fact 29946 1726882586.53900: variable 'interface' from source: set_fact 29946 1726882586.53903: variable 'interface' from source: set_fact 29946 1726882586.53998: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882586.54003: when evaluation is False, skipping this task 29946 1726882586.54005: _execute() done 29946 1726882586.54007: dumping result to json 29946 1726882586.54009: done dumping result, returning 29946 1726882586.54011: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-00000000001c] 29946 1726882586.54014: sending task result for task 12673a56-9f93-95e7-9dfb-00000000001c skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882586.54127: no more pending results, returning what we have 29946 1726882586.54130: results queue empty 29946 1726882586.54131: checking for any_errors_fatal 29946 1726882586.54138: done checking for any_errors_fatal 29946 1726882586.54139: checking for max_fail_percentage 29946 1726882586.54140: done checking for max_fail_percentage 29946 1726882586.54141: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.54142: done checking to see if all hosts have failed 29946 1726882586.54142: getting the remaining hosts for this loop 29946 1726882586.54144: done getting the remaining hosts for this loop 29946 1726882586.54147: getting the next task for host managed_node2 29946 1726882586.54153: done getting next task for host managed_node2 29946 
1726882586.54157: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29946 1726882586.54160: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882586.54174: getting variables 29946 1726882586.54176: in VariableManager get_vars() 29946 1726882586.54214: Calling all_inventory to load vars for managed_node2 29946 1726882586.54217: Calling groups_inventory to load vars for managed_node2 29946 1726882586.54219: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.54229: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.54231: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.54234: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.54823: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000001c 29946 1726882586.54827: WORKER PROCESS EXITING 29946 1726882586.56979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.58842: done with get_vars() 29946 1726882586.58865: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29946 1726882586.58950: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:26 -0400 (0:00:00.135) 0:00:12.699 ****** 29946 1726882586.58981: entering _queue_task() for managed_node2/yum 29946 1726882586.58983: Creating lock for yum 29946 1726882586.59409: worker is 1 (out of 1 available) 29946 1726882586.59425: exiting _queue_task() for managed_node2/yum 29946 1726882586.59436: done queuing things up, now waiting for results queue to drain 29946 1726882586.59437: waiting for pending results... 
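Both layers of this check are visible in the log: the package-manager selection condition (ansible_distribution == 'Fedora' or major version > 7) is True, but the task still skips because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection profile. The module arguments below are purely illustrative (the log hides them), but the shape of such a check-mode dnf task might be:

    # Sketch only: the when conditions come from the log; the package list and
    # dnf arguments are assumptions for illustration.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # hypothetical variable name
        state: latest
      check_mode: true
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined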
29946 1726882586.59619: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29946 1726882586.59794: in run() - task 12673a56-9f93-95e7-9dfb-00000000001d 29946 1726882586.59814: variable 'ansible_search_path' from source: unknown 29946 1726882586.59820: variable 'ansible_search_path' from source: unknown 29946 1726882586.59860: calling self._execute() 29946 1726882586.59978: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.59999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.60015: variable 'omit' from source: magic vars 29946 1726882586.60441: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.60457: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.60647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882586.63372: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882586.63490: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882586.63538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882586.63582: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882586.63623: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882586.63716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.63768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.63788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.63878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.63940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.64104: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.64109: Evaluated conditional (ansible_distribution_major_version | int < 8): False 29946 1726882586.64123: when evaluation is False, skipping this task 29946 1726882586.64138: _execute() done 29946 1726882586.64238: dumping result to json 29946 1726882586.64241: done dumping result, returning 29946 1726882586.64248: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-00000000001d] 29946 
1726882586.64252: sending task result for task 12673a56-9f93-95e7-9dfb-00000000001d skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 29946 1726882586.64477: no more pending results, returning what we have 29946 1726882586.64482: results queue empty 29946 1726882586.64484: checking for any_errors_fatal 29946 1726882586.64495: done checking for any_errors_fatal 29946 1726882586.64496: checking for max_fail_percentage 29946 1726882586.64501: done checking for max_fail_percentage 29946 1726882586.64501: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.64502: done checking to see if all hosts have failed 29946 1726882586.64503: getting the remaining hosts for this loop 29946 1726882586.64508: done getting the remaining hosts for this loop 29946 1726882586.64513: getting the next task for host managed_node2 29946 1726882586.64521: done getting next task for host managed_node2 29946 1726882586.64525: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29946 1726882586.64529: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882586.64551: getting variables 29946 1726882586.64553: in VariableManager get_vars() 29946 1726882586.64837: Calling all_inventory to load vars for managed_node2 29946 1726882586.64840: Calling groups_inventory to load vars for managed_node2 29946 1726882586.64845: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.64861: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.64864: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.64867: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.65424: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000001d 29946 1726882586.65427: WORKER PROCESS EXITING 29946 1726882586.67008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.68751: done with get_vars() 29946 1726882586.68772: done getting variables 29946 1726882586.68835: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:26 -0400 (0:00:00.098) 0:00:12.798 ****** 29946 1726882586.68869: entering _queue_task() for managed_node2/fail 29946 1726882586.69200: worker is 1 (out of 1 available) 29946 1726882586.69213: exiting _queue_task() for managed_node2/fail 29946 1726882586.69223: done queuing things up, now waiting for results queue to drain 29946 1726882586.69338: waiting for pending results... 
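The 'Ask user's consent to restart NetworkManager' task at main.yml:60 uses the fail action and is guarded by two role defaults, __network_wireless_connections_defined and __network_team_connections_defined, which the worker below resolves against network_connections and the interface fact before concluding the guard is False. The role defines those flags as Jinja2 expressions that are not printed in this excerpt; the Python below is only a rough stand-in for the idea, with an assumed placeholder connection:

    # Placeholder data: the real list comes from the play's network_connections
    # variable and the 'interface' set_fact; names and shape here are assumptions.
    network_connections = [
        {"name": "testnic0", "type": "ethernet", "state": "up"},
    ]

    # Rough stand-in for the role defaults the worker resolves below:
    #   __network_wireless_connections_defined / __network_team_connections_defined
    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)

    # Matches the skip result recorded further down:
    #   "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined"
    print(wireless_defined or team_defined)  # False -> the fail task is skipped
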
29946 1726882586.69525: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29946 1726882586.69674: in run() - task 12673a56-9f93-95e7-9dfb-00000000001e 29946 1726882586.69697: variable 'ansible_search_path' from source: unknown 29946 1726882586.69705: variable 'ansible_search_path' from source: unknown 29946 1726882586.69743: calling self._execute() 29946 1726882586.69840: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.69852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.69864: variable 'omit' from source: magic vars 29946 1726882586.70318: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.70322: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.70384: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882586.70660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882586.79925: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882586.80139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882586.80204: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882586.80254: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882586.80381: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882586.80533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.80605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.80708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.80712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.80715: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.80725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.80751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.80777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.80849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.80864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.80903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.80930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.80953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.80991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.81007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.81164: variable 'network_connections' from source: task vars 29946 1726882586.81251: variable 'interface' from source: set_fact 29946 1726882586.81284: variable 'interface' from source: set_fact 29946 1726882586.81294: variable 'interface' from source: set_fact 29946 1726882586.81351: variable 'interface' from source: set_fact 29946 1726882586.81468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882586.81775: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882586.81811: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882586.81845: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882586.81874: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882586.82114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882586.82312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882586.82318: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.82324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882586.82326: 
variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882586.82645: variable 'network_connections' from source: task vars 29946 1726882586.82681: variable 'interface' from source: set_fact 29946 1726882586.82750: variable 'interface' from source: set_fact 29946 1726882586.82770: variable 'interface' from source: set_fact 29946 1726882586.82850: variable 'interface' from source: set_fact 29946 1726882586.82937: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882586.82944: when evaluation is False, skipping this task 29946 1726882586.82950: _execute() done 29946 1726882586.82955: dumping result to json 29946 1726882586.82961: done dumping result, returning 29946 1726882586.82982: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-00000000001e] 29946 1726882586.83122: sending task result for task 12673a56-9f93-95e7-9dfb-00000000001e 29946 1726882586.83198: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000001e 29946 1726882586.83202: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882586.83260: no more pending results, returning what we have 29946 1726882586.83264: results queue empty 29946 1726882586.83265: checking for any_errors_fatal 29946 1726882586.83271: done checking for any_errors_fatal 29946 1726882586.83272: checking for max_fail_percentage 29946 1726882586.83273: done checking for max_fail_percentage 29946 1726882586.83274: checking to see if all hosts have failed and the running result is not ok 29946 1726882586.83275: done checking to see if all hosts have failed 29946 1726882586.83275: getting the remaining hosts for this loop 29946 1726882586.83277: done getting the remaining hosts for this loop 29946 1726882586.83281: getting the next task for host managed_node2 29946 1726882586.83289: done getting next task for host managed_node2 29946 1726882586.83292: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 29946 1726882586.83297: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882586.83311: getting variables 29946 1726882586.83313: in VariableManager get_vars() 29946 1726882586.83348: Calling all_inventory to load vars for managed_node2 29946 1726882586.83351: Calling groups_inventory to load vars for managed_node2 29946 1726882586.83353: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882586.83362: Calling all_plugins_play to load vars for managed_node2 29946 1726882586.83365: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882586.83368: Calling groups_plugins_play to load vars for managed_node2 29946 1726882586.90734: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882586.92542: done with get_vars() 29946 1726882586.92563: done getting variables 29946 1726882586.92619: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:26 -0400 (0:00:00.237) 0:00:13.036 ****** 29946 1726882586.92648: entering _queue_task() for managed_node2/package 29946 1726882586.93040: worker is 1 (out of 1 available) 29946 1726882586.93052: exiting _queue_task() for managed_node2/package 29946 1726882586.93064: done queuing things up, now waiting for results queue to drain 29946 1726882586.93066: waiting for pending results... 29946 1726882586.93514: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 29946 1726882586.93519: in run() - task 12673a56-9f93-95e7-9dfb-00000000001f 29946 1726882586.93522: variable 'ansible_search_path' from source: unknown 29946 1726882586.93526: variable 'ansible_search_path' from source: unknown 29946 1726882586.93529: calling self._execute() 29946 1726882586.93582: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882586.93591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882586.93598: variable 'omit' from source: magic vars 29946 1726882586.94046: variable 'ansible_distribution_major_version' from source: facts 29946 1726882586.94050: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882586.94201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882586.94472: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882586.94801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882586.94805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882586.94807: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882586.94810: variable 'network_packages' from source: role '' defaults 29946 1726882586.94998: variable '__network_provider_setup' from source: role '' defaults 29946 1726882586.95002: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882586.95004: variable 
'__network_service_name_default_nm' from source: role '' defaults 29946 1726882586.95007: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882586.95009: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882586.95144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882586.97099: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882586.97155: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882586.97196: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882586.97229: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882586.97265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882586.97342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.97369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.97392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.97437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.97450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.97492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.97518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.97598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.97602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.97605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.97815: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29946 1726882586.97921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.97946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.97970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.98009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.98023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.98111: variable 'ansible_python' from source: facts 29946 1726882586.98161: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29946 1726882586.98217: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882586.98301: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882586.98428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.98489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.98495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.98516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.98529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.98572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882586.98701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882586.98705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.98708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882586.98710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882586.98826: variable 'network_connections' from source: task vars 29946 1726882586.98832: variable 'interface' from source: set_fact 29946 1726882586.98936: variable 'interface' from source: set_fact 29946 1726882586.98945: variable 'interface' from source: set_fact 29946 1726882586.99047: variable 'interface' from source: set_fact 29946 1726882586.99146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882586.99172: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882586.99206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882586.99242: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882586.99288: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882586.99537: variable 'network_connections' from source: task vars 29946 1726882586.99540: variable 'interface' from source: set_fact 29946 1726882586.99636: variable 'interface' from source: set_fact 29946 1726882586.99683: variable 'interface' from source: set_fact 29946 1726882586.99736: variable 'interface' from source: set_fact 29946 1726882586.99809: variable '__network_packages_default_wireless' from source: role '' defaults 29946 1726882586.99876: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882587.00179: variable 'network_connections' from source: task vars 29946 1726882587.00182: variable 'interface' from source: set_fact 29946 1726882587.00335: variable 'interface' from source: set_fact 29946 1726882587.00338: variable 'interface' from source: set_fact 29946 1726882587.00341: variable 'interface' from source: set_fact 29946 1726882587.00370: variable '__network_packages_default_team' from source: role '' defaults 29946 1726882587.00443: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882587.00715: variable 'network_connections' from source: task vars 29946 1726882587.00718: variable 'interface' from source: set_fact 29946 1726882587.00779: variable 'interface' from source: set_fact 29946 1726882587.00788: variable 'interface' from source: set_fact 29946 1726882587.00846: variable 'interface' from source: set_fact 29946 1726882587.00924: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882587.00977: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882587.00989: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882587.01045: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882587.01310: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29946 1726882587.01745: variable 'network_connections' from source: task vars 29946 1726882587.01749: variable 'interface' from source: set_fact 29946 
1726882587.01816: variable 'interface' from source: set_fact 29946 1726882587.01822: variable 'interface' from source: set_fact 29946 1726882587.01882: variable 'interface' from source: set_fact 29946 1726882587.01912: variable 'ansible_distribution' from source: facts 29946 1726882587.01915: variable '__network_rh_distros' from source: role '' defaults 29946 1726882587.01922: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.01944: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29946 1726882587.02116: variable 'ansible_distribution' from source: facts 29946 1726882587.02119: variable '__network_rh_distros' from source: role '' defaults 29946 1726882587.02181: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.02184: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29946 1726882587.02305: variable 'ansible_distribution' from source: facts 29946 1726882587.02309: variable '__network_rh_distros' from source: role '' defaults 29946 1726882587.02315: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.02349: variable 'network_provider' from source: set_fact 29946 1726882587.02364: variable 'ansible_facts' from source: unknown 29946 1726882587.03270: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 29946 1726882587.03274: when evaluation is False, skipping this task 29946 1726882587.03276: _execute() done 29946 1726882587.03280: dumping result to json 29946 1726882587.03283: done dumping result, returning 29946 1726882587.03377: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-95e7-9dfb-00000000001f] 29946 1726882587.03380: sending task result for task 12673a56-9f93-95e7-9dfb-00000000001f 29946 1726882587.03447: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000001f 29946 1726882587.03450: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 29946 1726882587.03530: no more pending results, returning what we have 29946 1726882587.03534: results queue empty 29946 1726882587.03535: checking for any_errors_fatal 29946 1726882587.03545: done checking for any_errors_fatal 29946 1726882587.03546: checking for max_fail_percentage 29946 1726882587.03548: done checking for max_fail_percentage 29946 1726882587.03549: checking to see if all hosts have failed and the running result is not ok 29946 1726882587.03549: done checking to see if all hosts have failed 29946 1726882587.03550: getting the remaining hosts for this loop 29946 1726882587.03551: done getting the remaining hosts for this loop 29946 1726882587.03555: getting the next task for host managed_node2 29946 1726882587.03562: done getting next task for host managed_node2 29946 1726882587.03567: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29946 1726882587.03570: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882587.03590: getting variables 29946 1726882587.03592: in VariableManager get_vars() 29946 1726882587.03633: Calling all_inventory to load vars for managed_node2 29946 1726882587.03635: Calling groups_inventory to load vars for managed_node2 29946 1726882587.03637: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882587.03648: Calling all_plugins_play to load vars for managed_node2 29946 1726882587.03650: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882587.03652: Calling groups_plugins_play to load vars for managed_node2 29946 1726882587.05315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882587.06977: done with get_vars() 29946 1726882587.07001: done getting variables 29946 1726882587.07060: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:27 -0400 (0:00:00.144) 0:00:13.180 ****** 29946 1726882587.07095: entering _queue_task() for managed_node2/package 29946 1726882587.07417: worker is 1 (out of 1 available) 29946 1726882587.07428: exiting _queue_task() for managed_node2/package 29946 1726882587.07438: done queuing things up, now waiting for results queue to drain 29946 1726882587.07440: waiting for pending results... 
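Just above, the 'Install packages' task (main.yml:73) was skipped because not network_packages is subset(ansible_facts.packages.keys()) came out False, i.e. every requested package is already present in the gathered package facts. The subset test is the one loaded earlier from plugins/test/mathstuff.py and amounts to a set-subset check; a minimal sketch with placeholder package data (the real lists are not shown in this excerpt):

    # Placeholder values; the actual contents of network_packages and
    # ansible_facts.packages are not printed in this excerpt.
    network_packages = ["NetworkManager"]
    installed = {
        "NetworkManager": [{"version": "1.0.0"}],
        "kernel": [{"version": "5.0.0"}],
    }

    # Equivalent of the Jinja2 guard from the skip result:
    #   not network_packages is subset(ansible_facts.packages.keys())
    needs_install = not set(network_packages).issubset(installed.keys())

    print(needs_install)  # False -> nothing missing, so the task is skipped
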
29946 1726882587.07741: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29946 1726882587.07800: in run() - task 12673a56-9f93-95e7-9dfb-000000000020 29946 1726882587.07818: variable 'ansible_search_path' from source: unknown 29946 1726882587.07822: variable 'ansible_search_path' from source: unknown 29946 1726882587.07855: calling self._execute() 29946 1726882587.07946: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882587.07950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882587.07959: variable 'omit' from source: magic vars 29946 1726882587.08318: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.08330: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882587.08494: variable 'network_state' from source: role '' defaults 29946 1726882587.08499: Evaluated conditional (network_state != {}): False 29946 1726882587.08502: when evaluation is False, skipping this task 29946 1726882587.08504: _execute() done 29946 1726882587.08507: dumping result to json 29946 1726882587.08509: done dumping result, returning 29946 1726882587.08512: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-95e7-9dfb-000000000020] 29946 1726882587.08514: sending task result for task 12673a56-9f93-95e7-9dfb-000000000020 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882587.08624: no more pending results, returning what we have 29946 1726882587.08629: results queue empty 29946 1726882587.08630: checking for any_errors_fatal 29946 1726882587.08639: done checking for any_errors_fatal 29946 1726882587.08640: checking for max_fail_percentage 29946 1726882587.08642: done checking for max_fail_percentage 29946 1726882587.08643: checking to see if all hosts have failed and the running result is not ok 29946 1726882587.08644: done checking to see if all hosts have failed 29946 1726882587.08644: getting the remaining hosts for this loop 29946 1726882587.08646: done getting the remaining hosts for this loop 29946 1726882587.08649: getting the next task for host managed_node2 29946 1726882587.08657: done getting next task for host managed_node2 29946 1726882587.08661: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29946 1726882587.08665: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882587.08681: getting variables 29946 1726882587.08683: in VariableManager get_vars() 29946 1726882587.08725: Calling all_inventory to load vars for managed_node2 29946 1726882587.08727: Calling groups_inventory to load vars for managed_node2 29946 1726882587.08730: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882587.08736: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000020 29946 1726882587.08739: WORKER PROCESS EXITING 29946 1726882587.08961: Calling all_plugins_play to load vars for managed_node2 29946 1726882587.08965: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882587.08968: Calling groups_plugins_play to load vars for managed_node2 29946 1726882587.10171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882587.11728: done with get_vars() 29946 1726882587.11748: done getting variables 29946 1726882587.11804: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:27 -0400 (0:00:00.047) 0:00:13.227 ****** 29946 1726882587.11834: entering _queue_task() for managed_node2/package 29946 1726882587.12084: worker is 1 (out of 1 available) 29946 1726882587.12096: exiting _queue_task() for managed_node2/package 29946 1726882587.12107: done queuing things up, now waiting for results queue to drain 29946 1726882587.12108: waiting for pending results... 
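This task (main.yml:96) and the preceding one at main.yml:85 share the guard network_state != {}. network_state is resolved from the role defaults and, given the False result, must be empty in this run, so both nmstate-related installs are skipped. The evaluation is reproducible with plain Jinja2; the populated value in the second call is only a made-up contrast:

    from jinja2 import Environment

    env = Environment()
    uses_network_state = env.compile_expression("network_state != {}")

    # As in this run: the role-default value is empty, so the guard is False.
    print(uses_network_state(network_state={}))                  # False -> skipped

    # A play that supplied a desired state would flip it (made-up value).
    print(uses_network_state(network_state={"interfaces": []}))  # True
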
29946 1726882587.12361: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29946 1726882587.12474: in run() - task 12673a56-9f93-95e7-9dfb-000000000021 29946 1726882587.12490: variable 'ansible_search_path' from source: unknown 29946 1726882587.12496: variable 'ansible_search_path' from source: unknown 29946 1726882587.12527: calling self._execute() 29946 1726882587.12612: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882587.12618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882587.12629: variable 'omit' from source: magic vars 29946 1726882587.12976: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.12989: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882587.13182: variable 'network_state' from source: role '' defaults 29946 1726882587.13188: Evaluated conditional (network_state != {}): False 29946 1726882587.13191: when evaluation is False, skipping this task 29946 1726882587.13194: _execute() done 29946 1726882587.13196: dumping result to json 29946 1726882587.13198: done dumping result, returning 29946 1726882587.13200: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-95e7-9dfb-000000000021] 29946 1726882587.13202: sending task result for task 12673a56-9f93-95e7-9dfb-000000000021 29946 1726882587.13266: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000021 29946 1726882587.13269: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882587.13315: no more pending results, returning what we have 29946 1726882587.13319: results queue empty 29946 1726882587.13320: checking for any_errors_fatal 29946 1726882587.13329: done checking for any_errors_fatal 29946 1726882587.13330: checking for max_fail_percentage 29946 1726882587.13332: done checking for max_fail_percentage 29946 1726882587.13333: checking to see if all hosts have failed and the running result is not ok 29946 1726882587.13333: done checking to see if all hosts have failed 29946 1726882587.13334: getting the remaining hosts for this loop 29946 1726882587.13335: done getting the remaining hosts for this loop 29946 1726882587.13339: getting the next task for host managed_node2 29946 1726882587.13345: done getting next task for host managed_node2 29946 1726882587.13350: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29946 1726882587.13354: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882587.13369: getting variables 29946 1726882587.13371: in VariableManager get_vars() 29946 1726882587.13409: Calling all_inventory to load vars for managed_node2 29946 1726882587.13412: Calling groups_inventory to load vars for managed_node2 29946 1726882587.13414: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882587.13425: Calling all_plugins_play to load vars for managed_node2 29946 1726882587.13427: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882587.13429: Calling groups_plugins_play to load vars for managed_node2 29946 1726882587.14959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882587.16433: done with get_vars() 29946 1726882587.16454: done getting variables 29946 1726882587.16549: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:27 -0400 (0:00:00.047) 0:00:13.275 ****** 29946 1726882587.16579: entering _queue_task() for managed_node2/service 29946 1726882587.16581: Creating lock for service 29946 1726882587.16857: worker is 1 (out of 1 available) 29946 1726882587.16869: exiting _queue_task() for managed_node2/service 29946 1726882587.16880: done queuing things up, now waiting for results queue to drain 29946 1726882587.16882: waiting for pending results... 
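This is the first service task in the run, which is why the log shows 'Creating lock for service' and the service action plugin loading with found_in_cache=False, while the filter and test plugins above, already used by earlier tasks, report found_in_cache=True. The pattern is plain memoisation of plugin loading; the sketch below is a simplified analogy, not Ansible's actual loader code:

    # Simplified analogy for the found_in_cache=True/False messages in this log;
    # Ansible's real plugin loader is considerably more involved.
    _plugin_cache = {}

    def load_plugin(name):
        if name in _plugin_cache:
            print(f"Loading {name} (found_in_cache=True)")
            return _plugin_cache[name]
        print(f"Loading {name} (found_in_cache=False)")
        plugin = object()  # stand-in for the imported plugin class
        _plugin_cache[name] = plugin
        return plugin

    load_plugin("service")  # first service task in the play: cache miss
    load_plugin("service")  # later uses: cache hit
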
29946 1726882587.17218: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29946 1726882587.17262: in run() - task 12673a56-9f93-95e7-9dfb-000000000022 29946 1726882587.17274: variable 'ansible_search_path' from source: unknown 29946 1726882587.17277: variable 'ansible_search_path' from source: unknown 29946 1726882587.17312: calling self._execute() 29946 1726882587.17438: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882587.17442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882587.17445: variable 'omit' from source: magic vars 29946 1726882587.17753: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.17765: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882587.17873: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882587.18057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882587.20191: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882587.20263: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882587.20300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882587.20332: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882587.20399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882587.20435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.20468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.20504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.20780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.20783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.20788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.20791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.20795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29946 1726882587.20798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.20800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.20802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.20804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.20807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.20809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.20811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.20963: variable 'network_connections' from source: task vars 29946 1726882587.20976: variable 'interface' from source: set_fact 29946 1726882587.21298: variable 'interface' from source: set_fact 29946 1726882587.21302: variable 'interface' from source: set_fact 29946 1726882587.21304: variable 'interface' from source: set_fact 29946 1726882587.21307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882587.21363: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882587.21399: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882587.21431: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882587.21459: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882587.21496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882587.21517: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882587.21545: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.21569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882587.21626: variable '__network_team_connections_defined' from source: role '' defaults 
29946 1726882587.21862: variable 'network_connections' from source: task vars 29946 1726882587.21865: variable 'interface' from source: set_fact 29946 1726882587.21925: variable 'interface' from source: set_fact 29946 1726882587.21931: variable 'interface' from source: set_fact 29946 1726882587.21992: variable 'interface' from source: set_fact 29946 1726882587.22045: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882587.22048: when evaluation is False, skipping this task 29946 1726882587.22051: _execute() done 29946 1726882587.22053: dumping result to json 29946 1726882587.22055: done dumping result, returning 29946 1726882587.22063: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000022] 29946 1726882587.22079: sending task result for task 12673a56-9f93-95e7-9dfb-000000000022 29946 1726882587.22162: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000022 29946 1726882587.22164: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882587.22334: no more pending results, returning what we have 29946 1726882587.22338: results queue empty 29946 1726882587.22338: checking for any_errors_fatal 29946 1726882587.22345: done checking for any_errors_fatal 29946 1726882587.22345: checking for max_fail_percentage 29946 1726882587.22347: done checking for max_fail_percentage 29946 1726882587.22348: checking to see if all hosts have failed and the running result is not ok 29946 1726882587.22349: done checking to see if all hosts have failed 29946 1726882587.22350: getting the remaining hosts for this loop 29946 1726882587.22351: done getting the remaining hosts for this loop 29946 1726882587.22354: getting the next task for host managed_node2 29946 1726882587.22361: done getting next task for host managed_node2 29946 1726882587.22365: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29946 1726882587.22368: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882587.22381: getting variables 29946 1726882587.22383: in VariableManager get_vars() 29946 1726882587.22421: Calling all_inventory to load vars for managed_node2 29946 1726882587.22423: Calling groups_inventory to load vars for managed_node2 29946 1726882587.22426: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882587.22435: Calling all_plugins_play to load vars for managed_node2 29946 1726882587.22438: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882587.22440: Calling groups_plugins_play to load vars for managed_node2 29946 1726882587.23819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882587.25356: done with get_vars() 29946 1726882587.25381: done getting variables 29946 1726882587.25441: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:27 -0400 (0:00:00.088) 0:00:13.364 ****** 29946 1726882587.25473: entering _queue_task() for managed_node2/service 29946 1726882587.25756: worker is 1 (out of 1 available) 29946 1726882587.25768: exiting _queue_task() for managed_node2/service 29946 1726882587.25780: done queuing things up, now waiting for results queue to drain 29946 1726882587.25782: waiting for pending results... 29946 1726882587.26183: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29946 1726882587.26191: in run() - task 12673a56-9f93-95e7-9dfb-000000000023 29946 1726882587.26198: variable 'ansible_search_path' from source: unknown 29946 1726882587.26201: variable 'ansible_search_path' from source: unknown 29946 1726882587.26236: calling self._execute() 29946 1726882587.26327: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882587.26335: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882587.26344: variable 'omit' from source: magic vars 29946 1726882587.26750: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.26756: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882587.26888: variable 'network_provider' from source: set_fact 29946 1726882587.26892: variable 'network_state' from source: role '' defaults 29946 1726882587.26903: Evaluated conditional (network_provider == "nm" or network_state != {}): True 29946 1726882587.26909: variable 'omit' from source: magic vars 29946 1726882587.26970: variable 'omit' from source: magic vars 29946 1726882587.27045: variable 'network_service_name' from source: role '' defaults 29946 1726882587.27059: variable 'network_service_name' from source: role '' defaults 29946 1726882587.27318: variable '__network_provider_setup' from source: role '' defaults 29946 1726882587.27321: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882587.27324: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882587.27326: variable '__network_packages_default_nm' from source: role '' defaults 
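The 'Enable and start NetworkManager' task (main.yml:122) is the first guard in this stretch to pass: network_state is still empty, so the True result implies that network_provider, set earlier via set_fact, is "nm", and the role continues on to resolve network_service_name and the provider setup defaults. The same evaluation in plain Jinja2:

    from jinja2 import Environment

    env = Environment()
    guard = env.compile_expression('network_provider == "nm" or network_state != {}')

    # Values as implied by this run: provider "nm" from set_fact, empty network_state.
    print(guard(network_provider="nm", network_state={}))  # True -> task runs
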
29946 1726882587.27329: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882587.27512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882587.30382: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882587.30442: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882587.30492: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882587.30532: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882587.30656: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882587.30954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.30995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.31097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.31279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.31303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.31391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.31415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.31437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.31563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.31581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.31916: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29946 1726882587.32030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.32058: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.32084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.32199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.32203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.32540: variable 'ansible_python' from source: facts 29946 1726882587.32561: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29946 1726882587.32876: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882587.32934: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882587.33254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.33258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.33260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.33345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.33359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.33444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882587.33498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882587.33501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.33526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882587.33539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882587.33999: variable 'network_connections' from 
source: task vars 29946 1726882587.34004: variable 'interface' from source: set_fact 29946 1726882587.34007: variable 'interface' from source: set_fact 29946 1726882587.34135: variable 'interface' from source: set_fact 29946 1726882587.34316: variable 'interface' from source: set_fact 29946 1726882587.34759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882587.35067: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882587.35070: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882587.35084: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882587.35130: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882587.35189: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882587.35222: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882587.35253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882587.35285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882587.35333: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882587.35609: variable 'network_connections' from source: task vars 29946 1726882587.35614: variable 'interface' from source: set_fact 29946 1726882587.35689: variable 'interface' from source: set_fact 29946 1726882587.35719: variable 'interface' from source: set_fact 29946 1726882587.35771: variable 'interface' from source: set_fact 29946 1726882587.35936: variable '__network_packages_default_wireless' from source: role '' defaults 29946 1726882587.36046: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882587.36311: variable 'network_connections' from source: task vars 29946 1726882587.36315: variable 'interface' from source: set_fact 29946 1726882587.36381: variable 'interface' from source: set_fact 29946 1726882587.36390: variable 'interface' from source: set_fact 29946 1726882587.36481: variable 'interface' from source: set_fact 29946 1726882587.36504: variable '__network_packages_default_team' from source: role '' defaults 29946 1726882587.36592: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882587.37310: variable 'network_connections' from source: task vars 29946 1726882587.37349: variable 'interface' from source: set_fact 29946 1726882587.37390: variable 'interface' from source: set_fact 29946 1726882587.37395: variable 'interface' from source: set_fact 29946 1726882587.37650: variable 'interface' from source: set_fact 29946 1726882587.37731: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882587.37902: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 
1726882587.37908: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882587.37965: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882587.38499: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29946 1726882587.39516: variable 'network_connections' from source: task vars 29946 1726882587.39519: variable 'interface' from source: set_fact 29946 1726882587.39521: variable 'interface' from source: set_fact 29946 1726882587.39523: variable 'interface' from source: set_fact 29946 1726882587.39550: variable 'interface' from source: set_fact 29946 1726882587.39577: variable 'ansible_distribution' from source: facts 29946 1726882587.39580: variable '__network_rh_distros' from source: role '' defaults 29946 1726882587.39590: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.39698: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29946 1726882587.39789: variable 'ansible_distribution' from source: facts 29946 1726882587.39792: variable '__network_rh_distros' from source: role '' defaults 29946 1726882587.39797: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.39808: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29946 1726882587.40058: variable 'ansible_distribution' from source: facts 29946 1726882587.40061: variable '__network_rh_distros' from source: role '' defaults 29946 1726882587.40063: variable 'ansible_distribution_major_version' from source: facts 29946 1726882587.40065: variable 'network_provider' from source: set_fact 29946 1726882587.40067: variable 'omit' from source: magic vars 29946 1726882587.40072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882587.40101: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882587.40119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882587.40136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882587.40146: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882587.40179: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882587.40182: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882587.40185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882587.40483: Set connection var ansible_pipelining to False 29946 1726882587.40702: Set connection var ansible_shell_executable to /bin/sh 29946 1726882587.40705: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882587.40707: Set connection var ansible_timeout to 10 29946 1726882587.40710: Set connection var ansible_shell_type to sh 29946 1726882587.40712: Set connection var ansible_connection to ssh 29946 1726882587.40714: variable 'ansible_shell_executable' from source: unknown 29946 1726882587.40716: variable 'ansible_connection' from source: unknown 29946 1726882587.40718: variable 'ansible_module_compression' from source: unknown 29946 1726882587.40720: variable 'ansible_shell_type' from source: unknown 29946 1726882587.40722: variable 
'ansible_shell_executable' from source: unknown 29946 1726882587.40724: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882587.40729: variable 'ansible_pipelining' from source: unknown 29946 1726882587.40731: variable 'ansible_timeout' from source: unknown 29946 1726882587.40733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882587.40736: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882587.40738: variable 'omit' from source: magic vars 29946 1726882587.40740: starting attempt loop 29946 1726882587.40743: running the handler 29946 1726882587.41198: variable 'ansible_facts' from source: unknown 29946 1726882587.42302: _low_level_execute_command(): starting 29946 1726882587.42309: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882587.43811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882587.43918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882587.43924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882587.44020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882587.45705: stdout chunk (state=3): >>>/root <<< 29946 1726882587.45976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882587.46040: stderr chunk (state=3): >>><<< 29946 1726882587.46046: stdout chunk (state=3): >>><<< 29946 1726882587.46070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882587.46082: _low_level_execute_command(): starting 29946 1726882587.46090: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544 `" && echo ansible-tmp-1726882587.4606946-30571-27896199700544="` echo /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544 `" ) && sleep 0' 29946 1726882587.47291: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882587.47296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882587.47316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882587.47322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882587.47462: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882587.47525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882587.47619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882587.49523: stdout chunk (state=3): >>>ansible-tmp-1726882587.4606946-30571-27896199700544=/root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544 <<< 29946 1726882587.49738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882587.49742: stderr chunk (state=3): >>><<< 29946 1726882587.49744: stdout chunk (state=3): >>><<< 29946 1726882587.49763: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882587.4606946-30571-27896199700544=/root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882587.49813: variable 'ansible_module_compression' from source: unknown 29946 1726882587.49865: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 29946 1726882587.49869: ANSIBALLZ: Acquiring lock 29946 1726882587.49872: ANSIBALLZ: Lock acquired: 140626579263984 29946 1726882587.49874: ANSIBALLZ: Creating module 29946 1726882587.94766: ANSIBALLZ: Writing module into payload 29946 1726882587.94943: ANSIBALLZ: Writing module 29946 1726882587.94973: ANSIBALLZ: Renaming module 29946 1726882587.94988: ANSIBALLZ: Done creating module 29946 1726882587.95098: variable 'ansible_facts' from source: unknown 29946 1726882587.95369: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py 29946 1726882587.95610: Sending initial data 29946 1726882587.95670: Sent initial data (155 bytes) 29946 1726882587.96398: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882587.96403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882587.96405: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882587.96408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882587.96548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882587.98121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" <<< 29946 1726882587.98129: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882587.98182: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882587.98240: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpd1d0dgkw /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py <<< 29946 1726882587.98249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py" <<< 29946 1726882587.98303: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpd1d0dgkw" to remote "/root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py" <<< 29946 1726882587.98306: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py" <<< 29946 1726882588.00402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882588.00425: stderr chunk (state=3): >>><<< 29946 1726882588.00428: stdout chunk (state=3): >>><<< 29946 1726882588.00451: done transferring module to remote 29946 1726882588.00460: _low_level_execute_command(): starting 29946 1726882588.00465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/ /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py && sleep 0' 29946 1726882588.00870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882588.00907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882588.00910: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882588.00913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882588.00915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882588.00953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882588.00966: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882588.01030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882588.02879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882588.02883: stdout chunk (state=3): >>><<< 29946 1726882588.02888: stderr chunk (state=3): >>><<< 29946 1726882588.03043: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882588.03047: _low_level_execute_command(): starting 29946 1726882588.03049: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/AnsiballZ_systemd.py && sleep 0' 29946 1726882588.03810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882588.03904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882588.03984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882588.03987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882588.04090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882588.32701: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": 
"6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4677632", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303276544", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1499798000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 29946 1726882588.32707: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", 
"SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 29946 1726882588.34601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882588.34614: stderr chunk (state=3): >>><<< 29946 1726882588.34622: stdout chunk (state=3): >>><<< 29946 1726882588.34646: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4677632", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3303276544", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1499798000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882588.34952: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882588.34956: _low_level_execute_command(): starting 29946 1726882588.34958: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882587.4606946-30571-27896199700544/ > /dev/null 2>&1 && sleep 0' 29946 1726882588.35510: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882588.35526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882588.35542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882588.35562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882588.35580: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882588.35604: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882588.35622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882588.35642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882588.35706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882588.35741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882588.35756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882588.35881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882588.35958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882588.37846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882588.37850: stdout chunk (state=3): >>><<< 29946 1726882588.37857: stderr chunk (state=3): >>><<< 29946 1726882588.37872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882588.37881: handler run complete 29946 1726882588.37968: attempt loop complete, returning result 29946 1726882588.37971: _execute() done 29946 1726882588.37973: dumping result to json 29946 1726882588.37991: done dumping result, returning 29946 1726882588.38004: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-95e7-9dfb-000000000023] 29946 1726882588.38006: sending task result for task 12673a56-9f93-95e7-9dfb-000000000023 29946 1726882588.38363: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000023 29946 1726882588.38367: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882588.38420: no more pending results, returning what we have 29946 1726882588.38423: results queue empty 29946 1726882588.38424: checking for any_errors_fatal 29946 1726882588.38429: done checking for any_errors_fatal 29946 1726882588.38429: checking for max_fail_percentage 29946 1726882588.38431: done checking for max_fail_percentage 29946 1726882588.38432: checking to see if all hosts have failed and the running result is not ok 29946 1726882588.38432: done checking to see if all hosts have failed 29946 1726882588.38433: getting the remaining hosts for this loop 29946 1726882588.38434: done getting the remaining hosts for this loop 29946 1726882588.38438: getting the next task for 
host managed_node2 29946 1726882588.38445: done getting next task for host managed_node2 29946 1726882588.38448: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29946 1726882588.38451: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882588.38461: getting variables 29946 1726882588.38463: in VariableManager get_vars() 29946 1726882588.38630: Calling all_inventory to load vars for managed_node2 29946 1726882588.38633: Calling groups_inventory to load vars for managed_node2 29946 1726882588.38636: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882588.38647: Calling all_plugins_play to load vars for managed_node2 29946 1726882588.38650: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882588.38653: Calling groups_plugins_play to load vars for managed_node2 29946 1726882588.40950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882588.42671: done with get_vars() 29946 1726882588.42691: done getting variables 29946 1726882588.42748: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:28 -0400 (0:00:01.173) 0:00:14.537 ****** 29946 1726882588.42782: entering _queue_task() for managed_node2/service 29946 1726882588.43456: worker is 1 (out of 1 available) 29946 1726882588.43468: exiting _queue_task() for managed_node2/service 29946 1726882588.43480: done queuing things up, now waiting for results queue to drain 29946 1726882588.43482: waiting for pending results... 
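The censored "ok" result above comes from the role's NetworkManager service task: the worker logged its module arguments (name=NetworkManager, state=started, enabled=True) together with '_ansible_no_log': True, which is why the returned facts are hidden in the play output. A minimal sketch of an equivalent task, reconstructed from those logged arguments and not taken from the role's actual source (the FQCN and layout are assumptions):

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:   # resolved as ansible.legacy.systemd in the log
        name: NetworkManager
        state: started
        enabled: true
      no_log: true               # matches '_ansible_no_log': True above; hides the service facts
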
29946 1726882588.44109: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29946 1726882588.44114: in run() - task 12673a56-9f93-95e7-9dfb-000000000024 29946 1726882588.44117: variable 'ansible_search_path' from source: unknown 29946 1726882588.44120: variable 'ansible_search_path' from source: unknown 29946 1726882588.44403: calling self._execute() 29946 1726882588.44510: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882588.44517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882588.44605: variable 'omit' from source: magic vars 29946 1726882588.45252: variable 'ansible_distribution_major_version' from source: facts 29946 1726882588.45264: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882588.45702: variable 'network_provider' from source: set_fact 29946 1726882588.45705: Evaluated conditional (network_provider == "nm"): True 29946 1726882588.45708: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882588.45880: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882588.46313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882588.49422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882588.49503: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882588.49549: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882588.49597: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882588.49631: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882588.49732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882588.49771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882588.49840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882588.50118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882588.50154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882588.50398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882588.50402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29946 1726882588.50404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882588.50406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882588.50409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882588.50411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882588.50414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882588.50441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882588.50484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882588.50511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882588.50661: variable 'network_connections' from source: task vars 29946 1726882588.50680: variable 'interface' from source: set_fact 29946 1726882588.50763: variable 'interface' from source: set_fact 29946 1726882588.50778: variable 'interface' from source: set_fact 29946 1726882588.50846: variable 'interface' from source: set_fact 29946 1726882588.50950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882588.51130: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882588.51171: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882588.51271: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882588.51323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882588.51400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882588.51457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882588.51519: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882588.51598: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882588.51615: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882588.51879: variable 'network_connections' from source: task vars 29946 1726882588.51894: variable 'interface' from source: set_fact 29946 1726882588.51961: variable 'interface' from source: set_fact 29946 1726882588.51973: variable 'interface' from source: set_fact 29946 1726882588.52034: variable 'interface' from source: set_fact 29946 1726882588.52162: Evaluated conditional (__network_wpa_supplicant_required): False 29946 1726882588.52166: when evaluation is False, skipping this task 29946 1726882588.52168: _execute() done 29946 1726882588.52179: dumping result to json 29946 1726882588.52181: done dumping result, returning 29946 1726882588.52184: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-95e7-9dfb-000000000024] 29946 1726882588.52189: sending task result for task 12673a56-9f93-95e7-9dfb-000000000024 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 29946 1726882588.52307: no more pending results, returning what we have 29946 1726882588.52311: results queue empty 29946 1726882588.52311: checking for any_errors_fatal 29946 1726882588.52334: done checking for any_errors_fatal 29946 1726882588.52334: checking for max_fail_percentage 29946 1726882588.52336: done checking for max_fail_percentage 29946 1726882588.52337: checking to see if all hosts have failed and the running result is not ok 29946 1726882588.52338: done checking to see if all hosts have failed 29946 1726882588.52339: getting the remaining hosts for this loop 29946 1726882588.52340: done getting the remaining hosts for this loop 29946 1726882588.52343: getting the next task for host managed_node2 29946 1726882588.52350: done getting next task for host managed_node2 29946 1726882588.52353: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 29946 1726882588.52356: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882588.52369: getting variables 29946 1726882588.52371: in VariableManager get_vars() 29946 1726882588.52410: Calling all_inventory to load vars for managed_node2 29946 1726882588.52412: Calling groups_inventory to load vars for managed_node2 29946 1726882588.52415: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882588.52427: Calling all_plugins_play to load vars for managed_node2 29946 1726882588.52429: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882588.52432: Calling groups_plugins_play to load vars for managed_node2 29946 1726882588.52972: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000024 29946 1726882588.52975: WORKER PROCESS EXITING 29946 1726882588.54551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882588.57091: done with get_vars() 29946 1726882588.57115: done getting variables 29946 1726882588.57179: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:28 -0400 (0:00:00.144) 0:00:14.681 ****** 29946 1726882588.57215: entering _queue_task() for managed_node2/service 29946 1726882588.57523: worker is 1 (out of 1 available) 29946 1726882588.57534: exiting _queue_task() for managed_node2/service 29946 1726882588.57545: done queuing things up, now waiting for results queue to drain 29946 1726882588.57547: waiting for pending results... 
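The wpa_supplicant skip above follows directly from the conditional chain the executor evaluated: ansible_distribution_major_version != '6' and network_provider == "nm" were both True, but __network_wpa_supplicant_required resolved to False because neither wireless nor IEEE 802.1X connections are defined for this run, so the 'service' action was never invoked. A sketch of what such a gated service task looks like, with the condition taken from the logged false_condition and the task body assumed rather than copied from the role:

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:   # 'service' action plugin per the log; exact module name assumed
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required | bool
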
29946 1726882588.57821: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 29946 1726882588.57970: in run() - task 12673a56-9f93-95e7-9dfb-000000000025 29946 1726882588.57994: variable 'ansible_search_path' from source: unknown 29946 1726882588.58014: variable 'ansible_search_path' from source: unknown 29946 1726882588.58063: calling self._execute() 29946 1726882588.58198: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882588.58202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882588.58205: variable 'omit' from source: magic vars 29946 1726882588.58589: variable 'ansible_distribution_major_version' from source: facts 29946 1726882588.58610: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882588.58731: variable 'network_provider' from source: set_fact 29946 1726882588.58777: Evaluated conditional (network_provider == "initscripts"): False 29946 1726882588.58784: when evaluation is False, skipping this task 29946 1726882588.58788: _execute() done 29946 1726882588.58790: dumping result to json 29946 1726882588.58792: done dumping result, returning 29946 1726882588.58797: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-95e7-9dfb-000000000025] 29946 1726882588.58802: sending task result for task 12673a56-9f93-95e7-9dfb-000000000025 skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882588.59055: no more pending results, returning what we have 29946 1726882588.59060: results queue empty 29946 1726882588.59061: checking for any_errors_fatal 29946 1726882588.59069: done checking for any_errors_fatal 29946 1726882588.59070: checking for max_fail_percentage 29946 1726882588.59072: done checking for max_fail_percentage 29946 1726882588.59073: checking to see if all hosts have failed and the running result is not ok 29946 1726882588.59074: done checking to see if all hosts have failed 29946 1726882588.59074: getting the remaining hosts for this loop 29946 1726882588.59076: done getting the remaining hosts for this loop 29946 1726882588.59079: getting the next task for host managed_node2 29946 1726882588.59089: done getting next task for host managed_node2 29946 1726882588.59099: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29946 1726882588.59103: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882588.59289: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000025 29946 1726882588.59292: WORKER PROCESS EXITING 29946 1726882588.59306: getting variables 29946 1726882588.59307: in VariableManager get_vars() 29946 1726882588.59339: Calling all_inventory to load vars for managed_node2 29946 1726882588.59341: Calling groups_inventory to load vars for managed_node2 29946 1726882588.59343: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882588.59351: Calling all_plugins_play to load vars for managed_node2 29946 1726882588.59354: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882588.59356: Calling groups_plugins_play to load vars for managed_node2 29946 1726882588.61125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882588.63485: done with get_vars() 29946 1726882588.63561: done getting variables 29946 1726882588.63622: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:28 -0400 (0:00:00.064) 0:00:14.746 ****** 29946 1726882588.63656: entering _queue_task() for managed_node2/copy 29946 1726882588.64251: worker is 1 (out of 1 available) 29946 1726882588.64270: exiting _queue_task() for managed_node2/copy 29946 1726882588.64282: done queuing things up, now waiting for results queue to drain 29946 1726882588.64283: waiting for pending results... 
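The "Enable network service" skip just above and the "Ensure initscripts network file dependency is present" task queued next are both initscripts-provider tasks; with network_provider set to "nm" their shared when: condition (the logged false_condition network_provider == "initscripts") evaluates False, so the 'copy' action loaded here never runs either. An assumed sketch of that dependency task, shaped after the logged action plugin and condition only (the destination path and content are illustrative guesses, not the role's source):

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:      # 'copy' action plugin per the log; arguments assumed
        content: "# Managed by the network system role\n"
        dest: /etc/sysconfig/network
        mode: "0644"
      when: network_provider == "initscripts"
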
29946 1726882588.64715: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29946 1726882588.65201: in run() - task 12673a56-9f93-95e7-9dfb-000000000026 29946 1726882588.65205: variable 'ansible_search_path' from source: unknown 29946 1726882588.65209: variable 'ansible_search_path' from source: unknown 29946 1726882588.65211: calling self._execute() 29946 1726882588.65449: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882588.65526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882588.65529: variable 'omit' from source: magic vars 29946 1726882588.65936: variable 'ansible_distribution_major_version' from source: facts 29946 1726882588.65960: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882588.66397: variable 'network_provider' from source: set_fact 29946 1726882588.66404: Evaluated conditional (network_provider == "initscripts"): False 29946 1726882588.66407: when evaluation is False, skipping this task 29946 1726882588.66409: _execute() done 29946 1726882588.66412: dumping result to json 29946 1726882588.66415: done dumping result, returning 29946 1726882588.66426: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-95e7-9dfb-000000000026] 29946 1726882588.66429: sending task result for task 12673a56-9f93-95e7-9dfb-000000000026 29946 1726882588.66623: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000026 29946 1726882588.66626: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 29946 1726882588.66669: no more pending results, returning what we have 29946 1726882588.66672: results queue empty 29946 1726882588.66673: checking for any_errors_fatal 29946 1726882588.66682: done checking for any_errors_fatal 29946 1726882588.66682: checking for max_fail_percentage 29946 1726882588.66684: done checking for max_fail_percentage 29946 1726882588.66685: checking to see if all hosts have failed and the running result is not ok 29946 1726882588.66688: done checking to see if all hosts have failed 29946 1726882588.66689: getting the remaining hosts for this loop 29946 1726882588.66691: done getting the remaining hosts for this loop 29946 1726882588.66696: getting the next task for host managed_node2 29946 1726882588.66703: done getting next task for host managed_node2 29946 1726882588.66708: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29946 1726882588.66711: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882588.66767: getting variables 29946 1726882588.66769: in VariableManager get_vars() 29946 1726882588.66813: Calling all_inventory to load vars for managed_node2 29946 1726882588.66816: Calling groups_inventory to load vars for managed_node2 29946 1726882588.66818: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882588.66829: Calling all_plugins_play to load vars for managed_node2 29946 1726882588.66832: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882588.66930: Calling groups_plugins_play to load vars for managed_node2 29946 1726882588.69227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882588.70786: done with get_vars() 29946 1726882588.70809: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:28 -0400 (0:00:00.072) 0:00:14.818 ****** 29946 1726882588.70890: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 29946 1726882588.70892: Creating lock for fedora.linux_system_roles.network_connections 29946 1726882588.71306: worker is 1 (out of 1 available) 29946 1726882588.71318: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 29946 1726882588.71329: done queuing things up, now waiting for results queue to drain 29946 1726882588.71331: waiting for pending results... 29946 1726882588.71547: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29946 1726882588.71629: in run() - task 12673a56-9f93-95e7-9dfb-000000000027 29946 1726882588.71643: variable 'ansible_search_path' from source: unknown 29946 1726882588.71647: variable 'ansible_search_path' from source: unknown 29946 1726882588.71688: calling self._execute() 29946 1726882588.71779: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882588.71788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882588.71796: variable 'omit' from source: magic vars 29946 1726882588.72144: variable 'ansible_distribution_major_version' from source: facts 29946 1726882588.72155: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882588.72162: variable 'omit' from source: magic vars 29946 1726882588.72214: variable 'omit' from source: magic vars 29946 1726882588.72448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882588.74488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882588.74555: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882588.74592: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882588.74630: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882588.74658: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882588.74807: variable 'network_provider' from source: set_fact 29946 1726882588.74944: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882588.74948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882588.74951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882588.74975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882588.74991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882588.75053: variable 'omit' from source: magic vars 29946 1726882588.75366: variable 'omit' from source: magic vars 29946 1726882588.75370: variable 'network_connections' from source: task vars 29946 1726882588.75372: variable 'interface' from source: set_fact 29946 1726882588.75374: variable 'interface' from source: set_fact 29946 1726882588.75376: variable 'interface' from source: set_fact 29946 1726882588.75391: variable 'interface' from source: set_fact 29946 1726882588.75736: variable 'omit' from source: magic vars 29946 1726882588.75744: variable '__lsr_ansible_managed' from source: task vars 29946 1726882588.75802: variable '__lsr_ansible_managed' from source: task vars 29946 1726882588.76071: Loaded config def from plugin (lookup/template) 29946 1726882588.76074: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 29946 1726882588.76105: File lookup term: get_ansible_managed.j2 29946 1726882588.76109: variable 'ansible_search_path' from source: unknown 29946 1726882588.76112: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 29946 1726882588.76124: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 29946 1726882588.76142: variable 'ansible_search_path' from source: unknown 29946 1726882588.82515: variable 'ansible_managed' from source: unknown 29946 
1726882588.82648: variable 'omit' from source: magic vars 29946 1726882588.82765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882588.82768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882588.82770: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882588.82772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882588.82774: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882588.82795: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882588.82802: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882588.82808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882588.82916: Set connection var ansible_pipelining to False 29946 1726882588.82928: Set connection var ansible_shell_executable to /bin/sh 29946 1726882588.82940: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882588.82950: Set connection var ansible_timeout to 10 29946 1726882588.82960: Set connection var ansible_shell_type to sh 29946 1726882588.82967: Set connection var ansible_connection to ssh 29946 1726882588.83006: variable 'ansible_shell_executable' from source: unknown 29946 1726882588.83014: variable 'ansible_connection' from source: unknown 29946 1726882588.83095: variable 'ansible_module_compression' from source: unknown 29946 1726882588.83098: variable 'ansible_shell_type' from source: unknown 29946 1726882588.83103: variable 'ansible_shell_executable' from source: unknown 29946 1726882588.83105: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882588.83107: variable 'ansible_pipelining' from source: unknown 29946 1726882588.83109: variable 'ansible_timeout' from source: unknown 29946 1726882588.83111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882588.83202: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882588.83335: variable 'omit' from source: magic vars 29946 1726882588.83338: starting attempt loop 29946 1726882588.83341: running the handler 29946 1726882588.83343: _low_level_execute_command(): starting 29946 1726882588.83346: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882588.84117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882588.84140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882588.84156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882588.84263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882588.84291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882588.84312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882588.84348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882588.84451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882588.86161: stdout chunk (state=3): >>>/root <<< 29946 1726882588.86399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882588.86402: stdout chunk (state=3): >>><<< 29946 1726882588.86405: stderr chunk (state=3): >>><<< 29946 1726882588.86407: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882588.86409: _low_level_execute_command(): starting 29946 1726882588.86412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612 `" && echo ansible-tmp-1726882588.863343-30637-158842290299612="` echo /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612 `" ) && sleep 0' 29946 1726882588.87002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882588.87017: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882588.87029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882588.87046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882588.87062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882588.87083: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882588.87190: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882588.87213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882588.87228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882588.87322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882588.89224: stdout chunk (state=3): >>>ansible-tmp-1726882588.863343-30637-158842290299612=/root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612 <<< 29946 1726882588.89378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882588.89381: stdout chunk (state=3): >>><<< 29946 1726882588.89383: stderr chunk (state=3): >>><<< 29946 1726882588.89504: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882588.863343-30637-158842290299612=/root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882588.89508: variable 'ansible_module_compression' from source: unknown 29946 1726882588.89519: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 29946 1726882588.89534: ANSIBALLZ: Acquiring lock 29946 1726882588.89542: ANSIBALLZ: Lock acquired: 140626577275648 29946 1726882588.89550: ANSIBALLZ: Creating module 29946 1726882589.03903: ANSIBALLZ: Writing module into payload 29946 1726882589.04127: ANSIBALLZ: Writing module 29946 1726882589.04144: ANSIBALLZ: Renaming module 29946 1726882589.04150: ANSIBALLZ: Done creating module 29946 1726882589.04170: variable 'ansible_facts' from source: unknown 29946 1726882589.04238: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py 29946 1726882589.04340: Sending initial data 29946 1726882589.04348: 
Sent initial data (167 bytes) 29946 1726882589.04771: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.04774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.04777: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.04779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.04831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882589.04834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882589.04840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882589.04907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882589.06499: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882589.06501: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882589.06551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882589.06621: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpfg561wns /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py <<< 29946 1726882589.06624: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py" <<< 29946 1726882589.06679: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpfg561wns" to remote "/root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py" <<< 29946 1726882589.06682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py" <<< 29946 1726882589.07463: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882589.07504: stderr chunk (state=3): >>><<< 29946 1726882589.07508: stdout chunk (state=3): >>><<< 29946 1726882589.07545: done transferring module to remote 29946 1726882589.07554: _low_level_execute_command(): starting 29946 1726882589.07559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/ /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py && sleep 0' 29946 1726882589.07957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.07965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882589.07989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.07992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.07997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.08047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882589.08050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882589.08116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882589.09860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882589.09881: stderr chunk (state=3): >>><<< 29946 1726882589.09884: stdout chunk (state=3): >>><<< 29946 1726882589.09900: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882589.09903: _low_level_execute_command(): starting 29946 1726882589.09907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/AnsiballZ_network_connections.py && sleep 0' 29946 1726882589.10317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882589.10320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882589.10322: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.10324: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.10330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.10367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882589.10390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882589.10448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882589.54023: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, 
{"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 29946 1726882589.56601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882589.56605: stdout chunk (state=3): >>><<< 29946 1726882589.56607: stderr chunk (state=3): >>><<< 29946 1726882589.56610: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", "table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"dhcp4": false, "address": ["198.51.100.3/26", "2001:db8::2/32"], "route": [{"network": "198.51.100.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4, "table": 30200}, {"network": "198.51.100.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2, "table": 30400}, {"network": "2001:db8::4", "prefix": 32, "gateway": "2001:db8::1", "metric": 2, "table": 30600}], "routing_rule": [{"priority": 30200, "from": "198.51.100.58/26", "table": 30200}, {"priority": 30201, "family": "ipv4", "fwmark": 1, "fwmask": 1, "table": 30200}, {"priority": 30202, "family": "ipv4", "ipproto": 6, "table": 30200}, {"priority": 30203, "family": "ipv4", "sport": "128 - 256", "table": 30200}, {"priority": 30204, "family": "ipv4", "tos": 8, "table": 30200}, {"priority": 30400, "to": "198.51.100.128/26", "table": 30400}, {"priority": 30401, "family": "ipv4", "iif": "iiftest", "table": 30400}, {"priority": 30402, "family": "ipv4", "oif": "oiftest", "table": 30400}, {"priority": 30403, "from": "0.0.0.0/0", "to": "0.0.0.0/0", "table": 30400}, {"priority": 30600, "to": "2001:db8::4/32", "table": 30600}, {"priority": 30601, "family": "ipv6", "dport": "128 - 256", "invert": true, "table": 30600}, {"priority": 30602, "from": "::/0", "to": "::/0", 
"table": 30600}, {"priority": 200, "from": "198.51.100.56/26", "table": "custom"}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882589.56616: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'dhcp4': False, 'address': ['198.51.100.3/26', '2001:db8::2/32'], 'route': [{'network': '198.51.100.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4, 'table': 30200}, {'network': '198.51.100.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2, 'table': 30400}, {'network': '2001:db8::4', 'prefix': 32, 'gateway': '2001:db8::1', 'metric': 2, 'table': 30600}], 'routing_rule': [{'priority': 30200, 'from': '198.51.100.58/26', 'table': 30200}, {'priority': 30201, 'family': 'ipv4', 'fwmark': 1, 'fwmask': 1, 'table': 30200}, {'priority': 30202, 'family': 'ipv4', 'ipproto': 6, 'table': 30200}, {'priority': 30203, 'family': 'ipv4', 'sport': '128 - 256', 'table': 30200}, {'priority': 30204, 'family': 'ipv4', 'tos': 8, 'table': 30200}, {'priority': 30400, 'to': '198.51.100.128/26', 'table': 30400}, {'priority': 30401, 'family': 'ipv4', 'iif': 'iiftest', 'table': 30400}, {'priority': 30402, 'family': 'ipv4', 'oif': 'oiftest', 'table': 30400}, {'priority': 30403, 'from': '0.0.0.0/0', 'to': '0.0.0.0/0', 'table': 30400}, {'priority': 30600, 'to': '2001:db8::4/32', 'table': 30600}, {'priority': 30601, 'family': 'ipv6', 'dport': '128 - 256', 'invert': True, 'table': 30600}, {'priority': 30602, 'from': '::/0', 'to': '::/0', 'table': 30600}, {'priority': 200, 'from': '198.51.100.56/26', 'table': 'custom'}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', 
'_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882589.56618: _low_level_execute_command(): starting 29946 1726882589.56620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882588.863343-30637-158842290299612/ > /dev/null 2>&1 && sleep 0' 29946 1726882589.57620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882589.57624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882589.57640: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882589.57643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.57654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882589.57665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882589.57679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29946 1726882589.57695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882589.57788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882589.57804: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882589.57828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882589.57925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882589.59784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882589.59787: stdout chunk (state=3): >>><<< 29946 1726882589.59790: stderr chunk (state=3): >>><<< 29946 1726882589.59808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882589.59908: handler run complete 29946 1726882589.60198: attempt loop complete, returning result 29946 1726882589.60201: _execute() done 29946 1726882589.60204: dumping result to json 29946 1726882589.60206: done dumping result, returning 29946 1726882589.60209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-95e7-9dfb-000000000027] 29946 1726882589.60212: sending task result for task 12673a56-9f93-95e7-9dfb-000000000027 29946 1726882589.60316: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000027 29946 1726882589.60319: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 [004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 (not-active) 29946 1726882589.60809: no more pending results, returning what we have 29946 1726882589.60814: results queue empty 29946 1726882589.60815: checking for any_errors_fatal 29946 1726882589.60820: done checking for any_errors_fatal 29946 1726882589.60821: checking for max_fail_percentage 29946 1726882589.60822: done checking for max_fail_percentage 29946 1726882589.60823: checking to see if all hosts have failed and the running result is not ok 29946 1726882589.60824: done checking to see if all hosts have failed 29946 1726882589.60825: getting the remaining hosts for this loop 29946 1726882589.60826: done getting the remaining hosts for this loop 29946 1726882589.60830: getting the next 
task for host managed_node2 29946 1726882589.60836: done getting next task for host managed_node2 29946 1726882589.60840: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 29946 1726882589.60842: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882589.60853: getting variables 29946 1726882589.60855: in VariableManager get_vars() 29946 1726882589.60894: Calling all_inventory to load vars for managed_node2 29946 1726882589.60897: Calling groups_inventory to load vars for managed_node2 29946 1726882589.60899: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882589.60908: Calling all_plugins_play to load vars for managed_node2 29946 1726882589.60910: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882589.60913: Calling groups_plugins_play to load vars for managed_node2 29946 1726882589.62473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882589.65916: done with get_vars() 29946 1726882589.65941: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:29 -0400 (0:00:00.953) 0:00:15.771 ****** 29946 1726882589.66225: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 29946 1726882589.66227: Creating lock for fedora.linux_system_roles.network_state 29946 1726882589.66818: worker is 1 (out of 1 available) 29946 1726882589.66830: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 29946 1726882589.66841: done queuing things up, now waiting for results queue to drain 29946 1726882589.66842: waiting for pending results... 
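
For readability, the "Configure networking connection profiles" invocation recorded above can be read back as role input. A minimal sketch of the playbook variables that would produce that module_args payload (the wrapper variable name network_connections is the role's input inferred from the task and fact names in this log; the listing is abridged, and the full set of routes and routing rules appears verbatim in the module_args above):

    network_connections:
      - name: ethtest0
        interface_name: ethtest0
        type: ethernet
        state: up
        autoconnect: true
        ip:
          dhcp4: false
          address:
            - 198.51.100.3/26
            - 2001:db8::2/32
          route:
            - network: 198.51.100.64
              prefix: 26
              gateway: 198.51.100.6
              metric: 4
              table: 30200
            # ... remaining routes exactly as shown in the module_args above
          routing_rule:
            - priority: 30200
              from: 198.51.100.58/26
              table: 30200
            - priority: 200
              from: 198.51.100.56/26
              table: custom
            # ... remaining rules exactly as shown in the module_args above
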
29946 1726882589.67255: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 29946 1726882589.67406: in run() - task 12673a56-9f93-95e7-9dfb-000000000028 29946 1726882589.67512: variable 'ansible_search_path' from source: unknown 29946 1726882589.67516: variable 'ansible_search_path' from source: unknown 29946 1726882589.67518: calling self._execute() 29946 1726882589.67563: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.67578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.67595: variable 'omit' from source: magic vars 29946 1726882589.67987: variable 'ansible_distribution_major_version' from source: facts 29946 1726882589.68008: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882589.68609: variable 'network_state' from source: role '' defaults 29946 1726882589.68612: Evaluated conditional (network_state != {}): False 29946 1726882589.68615: when evaluation is False, skipping this task 29946 1726882589.68618: _execute() done 29946 1726882589.68620: dumping result to json 29946 1726882589.68623: done dumping result, returning 29946 1726882589.68625: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-95e7-9dfb-000000000028] 29946 1726882589.68627: sending task result for task 12673a56-9f93-95e7-9dfb-000000000028 29946 1726882589.68696: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000028 29946 1726882589.68700: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882589.68765: no more pending results, returning what we have 29946 1726882589.68769: results queue empty 29946 1726882589.68770: checking for any_errors_fatal 29946 1726882589.68797: done checking for any_errors_fatal 29946 1726882589.68799: checking for max_fail_percentage 29946 1726882589.68801: done checking for max_fail_percentage 29946 1726882589.68802: checking to see if all hosts have failed and the running result is not ok 29946 1726882589.68803: done checking to see if all hosts have failed 29946 1726882589.68804: getting the remaining hosts for this loop 29946 1726882589.68805: done getting the remaining hosts for this loop 29946 1726882589.68866: getting the next task for host managed_node2 29946 1726882589.68875: done getting next task for host managed_node2 29946 1726882589.68879: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29946 1726882589.68883: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882589.69103: getting variables 29946 1726882589.69105: in VariableManager get_vars() 29946 1726882589.69139: Calling all_inventory to load vars for managed_node2 29946 1726882589.69142: Calling groups_inventory to load vars for managed_node2 29946 1726882589.69144: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882589.69154: Calling all_plugins_play to load vars for managed_node2 29946 1726882589.69156: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882589.69159: Calling groups_plugins_play to load vars for managed_node2 29946 1726882589.72267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882589.75670: done with get_vars() 29946 1726882589.75707: done getting variables 29946 1726882589.75769: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:29 -0400 (0:00:00.095) 0:00:15.867 ****** 29946 1726882589.75813: entering _queue_task() for managed_node2/debug 29946 1726882589.76095: worker is 1 (out of 1 available) 29946 1726882589.76109: exiting _queue_task() for managed_node2/debug 29946 1726882589.76121: done queuing things up, now waiting for results queue to drain 29946 1726882589.76123: waiting for pending results... 29946 1726882589.76600: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29946 1726882589.76608: in run() - task 12673a56-9f93-95e7-9dfb-000000000029 29946 1726882589.76613: variable 'ansible_search_path' from source: unknown 29946 1726882589.76616: variable 'ansible_search_path' from source: unknown 29946 1726882589.76645: calling self._execute() 29946 1726882589.76905: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.76910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.76914: variable 'omit' from source: magic vars 29946 1726882589.77128: variable 'ansible_distribution_major_version' from source: facts 29946 1726882589.77144: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882589.77150: variable 'omit' from source: magic vars 29946 1726882589.77232: variable 'omit' from source: magic vars 29946 1726882589.77280: variable 'omit' from source: magic vars 29946 1726882589.77399: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882589.77432: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882589.77450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882589.77515: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882589.77560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882589.77616: variable 
'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882589.77625: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.77633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.77750: Set connection var ansible_pipelining to False 29946 1726882589.77771: Set connection var ansible_shell_executable to /bin/sh 29946 1726882589.77800: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882589.77803: Set connection var ansible_timeout to 10 29946 1726882589.77805: Set connection var ansible_shell_type to sh 29946 1726882589.77807: Set connection var ansible_connection to ssh 29946 1726882589.77832: variable 'ansible_shell_executable' from source: unknown 29946 1726882589.77881: variable 'ansible_connection' from source: unknown 29946 1726882589.77884: variable 'ansible_module_compression' from source: unknown 29946 1726882589.77886: variable 'ansible_shell_type' from source: unknown 29946 1726882589.77889: variable 'ansible_shell_executable' from source: unknown 29946 1726882589.77891: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.77895: variable 'ansible_pipelining' from source: unknown 29946 1726882589.77898: variable 'ansible_timeout' from source: unknown 29946 1726882589.77900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.78030: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882589.78048: variable 'omit' from source: magic vars 29946 1726882589.78100: starting attempt loop 29946 1726882589.78104: running the handler 29946 1726882589.78194: variable '__network_connections_result' from source: set_fact 29946 1726882589.78274: handler run complete 29946 1726882589.78298: attempt loop complete, returning result 29946 1726882589.78307: _execute() done 29946 1726882589.78319: dumping result to json 29946 1726882589.78327: done dumping result, returning 29946 1726882589.78399: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-95e7-9dfb-000000000029] 29946 1726882589.78403: sending task result for task 12673a56-9f93-95e7-9dfb-000000000029 29946 1726882589.78478: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000029 29946 1726882589.78482: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 (not-active)" ] } 29946 1726882589.78540: no more pending results, returning what we have 29946 1726882589.78544: results queue empty 29946 1726882589.78544: checking for any_errors_fatal 29946 1726882589.78552: done checking for any_errors_fatal 29946 1726882589.78553: checking for max_fail_percentage 29946 1726882589.78554: done checking for max_fail_percentage 29946 1726882589.78555: checking to see if all hosts have failed and the running result is not ok 29946 1726882589.78556: done checking to see if all hosts have failed 29946 1726882589.78557: 
getting the remaining hosts for this loop 29946 1726882589.78558: done getting the remaining hosts for this loop 29946 1726882589.78562: getting the next task for host managed_node2 29946 1726882589.78568: done getting next task for host managed_node2 29946 1726882589.78571: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29946 1726882589.78574: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882589.78587: getting variables 29946 1726882589.78589: in VariableManager get_vars() 29946 1726882589.78623: Calling all_inventory to load vars for managed_node2 29946 1726882589.78625: Calling groups_inventory to load vars for managed_node2 29946 1726882589.78627: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882589.78636: Calling all_plugins_play to load vars for managed_node2 29946 1726882589.78639: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882589.78641: Calling groups_plugins_play to load vars for managed_node2 29946 1726882589.80638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882589.83067: done with get_vars() 29946 1726882589.83095: done getting variables 29946 1726882589.83162: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:29 -0400 (0:00:00.073) 0:00:15.941 ****** 29946 1726882589.83201: entering _queue_task() for managed_node2/debug 29946 1726882589.83608: worker is 1 (out of 1 available) 29946 1726882589.83623: exiting _queue_task() for managed_node2/debug 29946 1726882589.83641: done queuing things up, now waiting for results queue to drain 29946 1726882589.83643: waiting for pending results... 
29946 1726882589.84210: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29946 1726882589.84356: in run() - task 12673a56-9f93-95e7-9dfb-00000000002a 29946 1726882589.84484: variable 'ansible_search_path' from source: unknown 29946 1726882589.84487: variable 'ansible_search_path' from source: unknown 29946 1726882589.84575: calling self._execute() 29946 1726882589.84630: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.84637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.84667: variable 'omit' from source: magic vars 29946 1726882589.85351: variable 'ansible_distribution_major_version' from source: facts 29946 1726882589.85364: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882589.85370: variable 'omit' from source: magic vars 29946 1726882589.85634: variable 'omit' from source: magic vars 29946 1726882589.85670: variable 'omit' from source: magic vars 29946 1726882589.85713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882589.85775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882589.85778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882589.85781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882589.86046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882589.86102: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882589.86105: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.86108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.86187: Set connection var ansible_pipelining to False 29946 1726882589.86197: Set connection var ansible_shell_executable to /bin/sh 29946 1726882589.86427: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882589.86430: Set connection var ansible_timeout to 10 29946 1726882589.86432: Set connection var ansible_shell_type to sh 29946 1726882589.86434: Set connection var ansible_connection to ssh 29946 1726882589.86456: variable 'ansible_shell_executable' from source: unknown 29946 1726882589.86459: variable 'ansible_connection' from source: unknown 29946 1726882589.86466: variable 'ansible_module_compression' from source: unknown 29946 1726882589.86469: variable 'ansible_shell_type' from source: unknown 29946 1726882589.86471: variable 'ansible_shell_executable' from source: unknown 29946 1726882589.86473: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.86476: variable 'ansible_pipelining' from source: unknown 29946 1726882589.86478: variable 'ansible_timeout' from source: unknown 29946 1726882589.86480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.86637: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882589.86688: variable 'omit' from source: magic vars 29946 1726882589.86691: starting attempt loop 29946 1726882589.86695: running the handler 29946 1726882589.86712: variable '__network_connections_result' from source: set_fact 29946 1726882589.86796: variable '__network_connections_result' from source: set_fact 29946 1726882589.87191: handler run complete 29946 1726882589.87245: attempt loop complete, returning result 29946 1726882589.87249: _execute() done 29946 1726882589.87251: dumping result to json 29946 1726882589.87333: done dumping result, returning 29946 1726882589.87337: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-95e7-9dfb-00000000002a] 29946 1726882589.87339: sending task result for task 12673a56-9f93-95e7-9dfb-00000000002a 29946 1726882589.87413: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000002a 29946 1726882589.87416: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/26", "2001:db8::2/32" ], "dhcp4": false, "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.100.64", "prefix": 26, "table": 30200 }, { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.100.128", "prefix": 26, "table": 30400 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db8::4", "prefix": 32, "table": 30600 } ], "routing_rule": [ { "from": "198.51.100.58/26", "priority": 30200, "table": 30200 }, { "family": "ipv4", "fwmark": 1, "fwmask": 1, "priority": 30201, "table": 30200 }, { "family": "ipv4", "ipproto": 6, "priority": 30202, "table": 30200 }, { "family": "ipv4", "priority": 30203, "sport": "128 - 256", "table": 30200 }, { "family": "ipv4", "priority": 30204, "table": 30200, "tos": 8 }, { "priority": 30400, "table": 30400, "to": "198.51.100.128/26" }, { "family": "ipv4", "iif": "iiftest", "priority": 30401, "table": 30400 }, { "family": "ipv4", "oif": "oiftest", "priority": 30402, "table": 30400 }, { "from": "0.0.0.0/0", "priority": 30403, "table": 30400, "to": "0.0.0.0/0" }, { "priority": 30600, "table": 30600, "to": "2001:db8::4/32" }, { "dport": "128 - 256", "family": "ipv6", "invert": true, "priority": 30601, "table": 30600 }, { "from": "::/0", "priority": 30602, "table": 30600, "to": "::/0" }, { "from": "198.51.100.56/26", "priority": 200, "table": "custom" } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541\n[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541", "[004] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 40a117ec-4b75-4c1f-bad4-81df3058e541 (not-active)" ] } } 29946 1726882589.87575: no more pending results, returning what we have 29946 1726882589.87578: results queue empty 29946 1726882589.87579: checking for any_errors_fatal 29946 1726882589.87584: done checking for 
any_errors_fatal 29946 1726882589.87585: checking for max_fail_percentage 29946 1726882589.87586: done checking for max_fail_percentage 29946 1726882589.87587: checking to see if all hosts have failed and the running result is not ok 29946 1726882589.87588: done checking to see if all hosts have failed 29946 1726882589.87589: getting the remaining hosts for this loop 29946 1726882589.87590: done getting the remaining hosts for this loop 29946 1726882589.87664: getting the next task for host managed_node2 29946 1726882589.87671: done getting next task for host managed_node2 29946 1726882589.87676: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29946 1726882589.87679: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882589.87696: getting variables 29946 1726882589.87698: in VariableManager get_vars() 29946 1726882589.87733: Calling all_inventory to load vars for managed_node2 29946 1726882589.87736: Calling groups_inventory to load vars for managed_node2 29946 1726882589.87738: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882589.87747: Calling all_plugins_play to load vars for managed_node2 29946 1726882589.87749: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882589.87752: Calling groups_plugins_play to load vars for managed_node2 29946 1726882589.90034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882589.91616: done with get_vars() 29946 1726882589.91639: done getting variables 29946 1726882589.91700: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:29 -0400 (0:00:00.085) 0:00:16.026 ****** 29946 1726882589.91734: entering _queue_task() for managed_node2/debug 29946 1726882589.92114: worker is 1 (out of 1 available) 29946 1726882589.92127: exiting _queue_task() for managed_node2/debug 29946 1726882589.92138: done queuing things up, now waiting for results queue to drain 29946 1726882589.92140: waiting for pending results... 
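
The two debug tasks traced above ("Show stderr messages for the network_connections" at tasks/main.yml:177 and "Show debug messages for the network_connections" at tasks/main.yml:181) print parts of the registered __network_connections_result fact. Plausible minimal forms, inferred only from the task names and the variables referenced in this log (the actual role tasks may carry additional verbosity settings or conditionals):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      debug:
        var: __network_connections_result
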
29946 1726882589.92384: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29946 1726882589.92601: in run() - task 12673a56-9f93-95e7-9dfb-00000000002b 29946 1726882589.92605: variable 'ansible_search_path' from source: unknown 29946 1726882589.92608: variable 'ansible_search_path' from source: unknown 29946 1726882589.92611: calling self._execute() 29946 1726882589.92613: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.92616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.92633: variable 'omit' from source: magic vars 29946 1726882589.93004: variable 'ansible_distribution_major_version' from source: facts 29946 1726882589.93017: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882589.93138: variable 'network_state' from source: role '' defaults 29946 1726882589.93157: Evaluated conditional (network_state != {}): False 29946 1726882589.93162: when evaluation is False, skipping this task 29946 1726882589.93168: _execute() done 29946 1726882589.93174: dumping result to json 29946 1726882589.93177: done dumping result, returning 29946 1726882589.93188: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-95e7-9dfb-00000000002b] 29946 1726882589.93192: sending task result for task 12673a56-9f93-95e7-9dfb-00000000002b skipping: [managed_node2] => { "false_condition": "network_state != {}" } 29946 1726882589.93347: no more pending results, returning what we have 29946 1726882589.93350: results queue empty 29946 1726882589.93351: checking for any_errors_fatal 29946 1726882589.93363: done checking for any_errors_fatal 29946 1726882589.93364: checking for max_fail_percentage 29946 1726882589.93366: done checking for max_fail_percentage 29946 1726882589.93367: checking to see if all hosts have failed and the running result is not ok 29946 1726882589.93368: done checking to see if all hosts have failed 29946 1726882589.93368: getting the remaining hosts for this loop 29946 1726882589.93369: done getting the remaining hosts for this loop 29946 1726882589.93373: getting the next task for host managed_node2 29946 1726882589.93379: done getting next task for host managed_node2 29946 1726882589.93383: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 29946 1726882589.93389: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882589.93405: getting variables 29946 1726882589.93406: in VariableManager get_vars() 29946 1726882589.93439: Calling all_inventory to load vars for managed_node2 29946 1726882589.93441: Calling groups_inventory to load vars for managed_node2 29946 1726882589.93443: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882589.93451: Calling all_plugins_play to load vars for managed_node2 29946 1726882589.93453: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882589.93456: Calling groups_plugins_play to load vars for managed_node2 29946 1726882589.94006: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000002b 29946 1726882589.94010: WORKER PROCESS EXITING 29946 1726882589.94389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882589.95344: done with get_vars() 29946 1726882589.95365: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:29 -0400 (0:00:00.037) 0:00:16.064 ****** 29946 1726882589.95452: entering _queue_task() for managed_node2/ping 29946 1726882589.95453: Creating lock for ping 29946 1726882589.95734: worker is 1 (out of 1 available) 29946 1726882589.95747: exiting _queue_task() for managed_node2/ping 29946 1726882589.95758: done queuing things up, now waiting for results queue to drain 29946 1726882589.95760: waiting for pending results... 29946 1726882589.95979: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 29946 1726882589.96200: in run() - task 12673a56-9f93-95e7-9dfb-00000000002c 29946 1726882589.96204: variable 'ansible_search_path' from source: unknown 29946 1726882589.96207: variable 'ansible_search_path' from source: unknown 29946 1726882589.96210: calling self._execute() 29946 1726882589.96269: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.96283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.96300: variable 'omit' from source: magic vars 29946 1726882589.96679: variable 'ansible_distribution_major_version' from source: facts 29946 1726882589.96697: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882589.96781: variable 'omit' from source: magic vars 29946 1726882589.96785: variable 'omit' from source: magic vars 29946 1726882589.96811: variable 'omit' from source: magic vars 29946 1726882589.96853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882589.96911: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882589.96931: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882589.96956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882589.96969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882589.97011: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882589.97015: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.97017: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.97084: Set connection var ansible_pipelining to False 29946 1726882589.97087: Set connection var ansible_shell_executable to /bin/sh 29946 1726882589.97099: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882589.97106: Set connection var ansible_timeout to 10 29946 1726882589.97108: Set connection var ansible_shell_type to sh 29946 1726882589.97118: Set connection var ansible_connection to ssh 29946 1726882589.97134: variable 'ansible_shell_executable' from source: unknown 29946 1726882589.97137: variable 'ansible_connection' from source: unknown 29946 1726882589.97140: variable 'ansible_module_compression' from source: unknown 29946 1726882589.97142: variable 'ansible_shell_type' from source: unknown 29946 1726882589.97144: variable 'ansible_shell_executable' from source: unknown 29946 1726882589.97146: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882589.97148: variable 'ansible_pipelining' from source: unknown 29946 1726882589.97152: variable 'ansible_timeout' from source: unknown 29946 1726882589.97156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882589.97304: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882589.97316: variable 'omit' from source: magic vars 29946 1726882589.97320: starting attempt loop 29946 1726882589.97323: running the handler 29946 1726882589.97332: _low_level_execute_command(): starting 29946 1726882589.97339: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882589.97811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.97815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.97818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882589.97820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882589.97863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882589.97877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882589.97954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882589.99589: stdout chunk (state=3): >>>/root <<< 29946 1726882589.99675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882589.99703: stderr chunk 
(state=3): >>><<< 29946 1726882589.99708: stdout chunk (state=3): >>><<< 29946 1726882589.99726: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882589.99736: _low_level_execute_command(): starting 29946 1726882589.99742: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775 `" && echo ansible-tmp-1726882589.9972575-30688-145986236724775="` echo /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775 `" ) && sleep 0' 29946 1726882590.00151: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.00155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.00157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882590.00166: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.00168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.00207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.00213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.00276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.02145: stdout chunk (state=3): >>>ansible-tmp-1726882589.9972575-30688-145986236724775=/root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775 <<< 29946 1726882590.02252: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.02280: stderr chunk (state=3): >>><<< 29946 1726882590.02282: stdout chunk (state=3): >>><<< 29946 1726882590.02306: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882589.9972575-30688-145986236724775=/root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.02338: variable 'ansible_module_compression' from source: unknown 29946 1726882590.02369: ANSIBALLZ: Using lock for ping 29946 1726882590.02372: ANSIBALLZ: Acquiring lock 29946 1726882590.02374: ANSIBALLZ: Lock acquired: 140626576989392 29946 1726882590.02377: ANSIBALLZ: Creating module 29946 1726882590.10700: ANSIBALLZ: Writing module into payload 29946 1726882590.10704: ANSIBALLZ: Writing module 29946 1726882590.10706: ANSIBALLZ: Renaming module 29946 1726882590.10709: ANSIBALLZ: Done creating module 29946 1726882590.10711: variable 'ansible_facts' from source: unknown 29946 1726882590.10713: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py 29946 1726882590.10924: Sending initial data 29946 1726882590.10933: Sent initial data (153 bytes) 29946 1726882590.11508: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.11527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.11544: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.11637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.13166: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882590.13242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882590.13328: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp_mqje3d_ /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py <<< 29946 1726882590.13338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py" <<< 29946 1726882590.13390: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp_mqje3d_" to remote "/root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py" <<< 29946 1726882590.14166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.14184: stdout chunk (state=3): >>><<< 29946 1726882590.14204: stderr chunk (state=3): >>><<< 29946 1726882590.14262: done transferring module to remote 29946 1726882590.14278: _low_level_execute_command(): starting 29946 1726882590.14297: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/ /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py && sleep 0' 29946 1726882590.14879: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882590.14904: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.15009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.15030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.15045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.15138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.16957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.16960: stdout chunk (state=3): >>><<< 29946 1726882590.16963: stderr chunk (state=3): >>><<< 29946 1726882590.17015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.17023: _low_level_execute_command(): starting 29946 1726882590.17268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/AnsiballZ_ping.py && sleep 0' 29946 1726882590.18100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882590.18157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.18228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.18243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.18271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.18378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
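
The chunks above and below trace the role's "Re-test connectivity" task (tasks/main.yml:192) transferring AnsiballZ_ping.py over the existing SSH multiplexed connection and executing it; the pong result follows in the next stdout chunk. A minimal equivalent task, inferred from the module name in this log (the role may reference the module by a fully qualified collection name), simply confirms the managed node is still reachable and has a usable Python after the connection changes:

    - name: Re-test connectivity
      ping:
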
29946 1726882590.33108: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 29946 1726882590.34314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882590.34327: stdout chunk (state=3): >>><<< 29946 1726882590.34369: stderr chunk (state=3): >>><<< 29946 1726882590.34395: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882590.34427: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882590.34443: _low_level_execute_command(): starting 29946 1726882590.34453: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882589.9972575-30688-145986236724775/ > /dev/null 2>&1 && sleep 0' 29946 1726882590.35029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882590.35067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882590.35090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.35125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882590.35133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882590.35138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.35196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.35217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.35287: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.37127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.37138: stderr chunk (state=3): >>><<< 29946 1726882590.37146: stdout chunk (state=3): >>><<< 29946 1726882590.37303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.37312: handler run complete 29946 1726882590.37315: attempt loop complete, returning result 29946 1726882590.37317: _execute() done 29946 1726882590.37319: dumping result to json 29946 1726882590.37321: done dumping result, returning 29946 1726882590.37323: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-95e7-9dfb-00000000002c] 29946 1726882590.37325: sending task result for task 12673a56-9f93-95e7-9dfb-00000000002c 29946 1726882590.37387: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000002c 29946 1726882590.37391: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 29946 1726882590.37456: no more pending results, returning what we have 29946 1726882590.37460: results queue empty 29946 1726882590.37461: checking for any_errors_fatal 29946 1726882590.37470: done checking for any_errors_fatal 29946 1726882590.37471: checking for max_fail_percentage 29946 1726882590.37472: done checking for max_fail_percentage 29946 1726882590.37473: checking to see if all hosts have failed and the running result is not ok 29946 1726882590.37474: done checking to see if all hosts have failed 29946 1726882590.37475: getting the remaining hosts for this loop 29946 1726882590.37477: done 
getting the remaining hosts for this loop 29946 1726882590.37481: getting the next task for host managed_node2 29946 1726882590.37497: done getting next task for host managed_node2 29946 1726882590.37500: ^ task is: TASK: meta (role_complete) 29946 1726882590.37504: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882590.37515: getting variables 29946 1726882590.37517: in VariableManager get_vars() 29946 1726882590.37560: Calling all_inventory to load vars for managed_node2 29946 1726882590.37563: Calling groups_inventory to load vars for managed_node2 29946 1726882590.37565: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882590.37576: Calling all_plugins_play to load vars for managed_node2 29946 1726882590.37578: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882590.37582: Calling groups_plugins_play to load vars for managed_node2 29946 1726882590.39647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882590.40840: done with get_vars() 29946 1726882590.40864: done getting variables 29946 1726882590.40943: done queuing things up, now waiting for results queue to drain 29946 1726882590.40945: results queue empty 29946 1726882590.40946: checking for any_errors_fatal 29946 1726882590.40949: done checking for any_errors_fatal 29946 1726882590.40949: checking for max_fail_percentage 29946 1726882590.40950: done checking for max_fail_percentage 29946 1726882590.40954: checking to see if all hosts have failed and the running result is not ok 29946 1726882590.40955: done checking to see if all hosts have failed 29946 1726882590.40956: getting the remaining hosts for this loop 29946 1726882590.40957: done getting the remaining hosts for this loop 29946 1726882590.40961: getting the next task for host managed_node2 29946 1726882590.40964: done getting next task for host managed_node2 29946 1726882590.40967: ^ task is: TASK: Get the routing rule for looking up the table 30200 29946 1726882590.40968: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882590.40970: getting variables 29946 1726882590.40971: in VariableManager get_vars() 29946 1726882590.40986: Calling all_inventory to load vars for managed_node2 29946 1726882590.40988: Calling groups_inventory to load vars for managed_node2 29946 1726882590.40990: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882590.40997: Calling all_plugins_play to load vars for managed_node2 29946 1726882590.40999: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882590.41001: Calling groups_plugins_play to load vars for managed_node2 29946 1726882590.42469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882590.44162: done with get_vars() 29946 1726882590.44182: done getting variables 29946 1726882590.44216: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30200] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:115 Friday 20 September 2024 21:36:30 -0400 (0:00:00.487) 0:00:16.551 ****** 29946 1726882590.44236: entering _queue_task() for managed_node2/command 29946 1726882590.44530: worker is 1 (out of 1 available) 29946 1726882590.44544: exiting _queue_task() for managed_node2/command 29946 1726882590.44555: done queuing things up, now waiting for results queue to drain 29946 1726882590.44556: waiting for pending results... 29946 1726882590.44710: running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 30200 29946 1726882590.44786: in run() - task 12673a56-9f93-95e7-9dfb-00000000005c 29946 1726882590.44805: variable 'ansible_search_path' from source: unknown 29946 1726882590.44839: calling self._execute() 29946 1726882590.44912: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882590.44920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882590.44938: variable 'omit' from source: magic vars 29946 1726882590.45305: variable 'ansible_distribution_major_version' from source: facts 29946 1726882590.45308: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882590.45399: variable 'ansible_distribution_major_version' from source: facts 29946 1726882590.45422: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882590.45433: variable 'omit' from source: magic vars 29946 1726882590.45498: variable 'omit' from source: magic vars 29946 1726882590.45504: variable 'omit' from source: magic vars 29946 1726882590.45563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882590.45608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882590.45641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882590.45662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882590.45683: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882590.45738: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882590.45848: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882590.45851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882590.45885: Set connection var ansible_pipelining to False 29946 1726882590.45903: Set connection var ansible_shell_executable to /bin/sh 29946 1726882590.45913: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882590.45923: Set connection var ansible_timeout to 10 29946 1726882590.45934: Set connection var ansible_shell_type to sh 29946 1726882590.45941: Set connection var ansible_connection to ssh 29946 1726882590.45978: variable 'ansible_shell_executable' from source: unknown 29946 1726882590.45990: variable 'ansible_connection' from source: unknown 29946 1726882590.46001: variable 'ansible_module_compression' from source: unknown 29946 1726882590.46013: variable 'ansible_shell_type' from source: unknown 29946 1726882590.46021: variable 'ansible_shell_executable' from source: unknown 29946 1726882590.46067: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882590.46070: variable 'ansible_pipelining' from source: unknown 29946 1726882590.46073: variable 'ansible_timeout' from source: unknown 29946 1726882590.46075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882590.46215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882590.46232: variable 'omit' from source: magic vars 29946 1726882590.46242: starting attempt loop 29946 1726882590.46248: running the handler 29946 1726882590.46284: _low_level_execute_command(): starting 29946 1726882590.46289: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882590.46998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.47002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.47005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882590.47008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.47062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.47065: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.47128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.48711: stdout chunk (state=3): >>>/root <<< 29946 1726882590.48811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.48841: stderr chunk (state=3): >>><<< 29946 1726882590.48844: stdout chunk (state=3): >>><<< 29946 1726882590.48861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.48903: _low_level_execute_command(): starting 29946 1726882590.48907: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008 `" && echo ansible-tmp-1726882590.4886763-30716-168475619066008="` echo /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008 `" ) && sleep 0' 29946 1726882590.49263: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882590.49300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882590.49304: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882590.49313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.49316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.49318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.49357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.49360: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.49425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.51284: stdout chunk (state=3): >>>ansible-tmp-1726882590.4886763-30716-168475619066008=/root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008 <<< 29946 1726882590.51398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.51421: stderr chunk (state=3): >>><<< 29946 1726882590.51424: stdout chunk (state=3): >>><<< 29946 1726882590.51441: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882590.4886763-30716-168475619066008=/root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.51462: variable 'ansible_module_compression' from source: unknown 29946 1726882590.51502: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882590.51530: variable 'ansible_facts' from source: unknown 29946 1726882590.51589: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py 29946 1726882590.51679: Sending initial data 29946 1726882590.51682: Sent initial data (156 bytes) 29946 1726882590.52073: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.52111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882590.52114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882590.52116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.52118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882590.52122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.52163: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.52166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.52230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.53742: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882590.53746: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882590.53815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882590.53879: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpi911gde4 /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py <<< 29946 1726882590.53882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py" <<< 29946 1726882590.53939: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpi911gde4" to remote "/root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py" <<< 29946 1726882590.54559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.54595: stderr chunk (state=3): >>><<< 29946 1726882590.54598: stdout chunk (state=3): >>><<< 29946 1726882590.54616: done transferring module to remote 29946 1726882590.54624: _low_level_execute_command(): starting 29946 1726882590.54629: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/ /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py && sleep 0' 29946 1726882590.55014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.55019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.55033: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.55083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.55091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.55154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.56871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.56899: stderr chunk (state=3): >>><<< 29946 1726882590.56903: stdout chunk (state=3): >>><<< 29946 1726882590.56915: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.56918: _low_level_execute_command(): starting 29946 1726882590.56921: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/AnsiballZ_command.py && sleep 0' 29946 1726882590.57347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.57350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.57352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882590.57354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.57356: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.57399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.57418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.57482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.72883: stdout chunk (state=3): >>> {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-20 21:36:30.722923", "end": "2024-09-20 21:36:30.727744", "delta": "0:00:00.004821", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882590.74222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882590.74243: stderr chunk (state=3): >>><<< 29946 1726882590.74246: stdout chunk (state=3): >>><<< 29946 1726882590.74265: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30200:\tfrom 198.51.100.58/26 lookup 30200 proto static\n30201:\tfrom all fwmark 0x1/0x1 lookup 30200 proto static\n30202:\tfrom all ipproto tcp lookup 30200 proto static\n30203:\tfrom all sport 128-256 lookup 30200 proto static\n30204:\tfrom all tos throughput lookup 30200 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30200"], "start": "2024-09-20 21:36:30.722923", "end": "2024-09-20 21:36:30.727744", "delta": "0:00:00.004821", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30200", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882590.74297: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30200', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882590.74304: _low_level_execute_command(): starting 29946 1726882590.74309: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882590.4886763-30716-168475619066008/ > /dev/null 2>&1 && sleep 0' 29946 1726882590.74734: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882590.74737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.74744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882590.74746: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882590.74748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.74786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.74790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.74860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.76743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.76747: stdout chunk (state=3): >>><<< 29946 1726882590.76750: stderr chunk (state=3): >>><<< 29946 1726882590.76767: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.76817: handler run complete 29946 1726882590.76825: Evaluated conditional (False): False 29946 1726882590.76844: attempt loop complete, returning result 29946 1726882590.76853: _execute() done 29946 1726882590.76899: dumping result to json 29946 1726882590.76903: done dumping result, returning 29946 1726882590.76906: done running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 30200 [12673a56-9f93-95e7-9dfb-00000000005c] 29946 1726882590.76908: sending task result for task 12673a56-9f93-95e7-9dfb-00000000005c ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30200" ], "delta": "0:00:00.004821", "end": "2024-09-20 21:36:30.727744", "rc": 0, "start": "2024-09-20 21:36:30.722923" } STDOUT: 30200: from 198.51.100.58/26 lookup 30200 proto static 30201: from all fwmark 0x1/0x1 lookup 30200 proto static 30202: from all ipproto tcp lookup 30200 proto static 30203: from all sport 128-256 lookup 30200 proto static 30204: from all tos throughput lookup 30200 proto static 29946 1726882590.77119: no more pending results, returning what we have 29946 1726882590.77122: results queue empty 29946 1726882590.77123: checking for any_errors_fatal 29946 1726882590.77124: done checking for any_errors_fatal 29946 1726882590.77125: checking for max_fail_percentage 29946 1726882590.77127: done checking for max_fail_percentage 29946 1726882590.77128: checking to see if all hosts have failed and the running result is not ok 29946 1726882590.77128: done checking to see if all hosts have failed 29946 1726882590.77129: getting the remaining hosts for this loop 29946 1726882590.77130: done getting the remaining hosts for this loop 29946 1726882590.77134: getting the next task for host managed_node2 29946 1726882590.77141: done getting next task for host managed_node2 29946 1726882590.77143: ^ task is: TASK: Get the routing rule for looking up the table 30400 29946 1726882590.77145: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882590.77149: getting variables 29946 1726882590.77150: in VariableManager get_vars() 29946 1726882590.77186: Calling all_inventory to load vars for managed_node2 29946 1726882590.77189: Calling groups_inventory to load vars for managed_node2 29946 1726882590.77191: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882590.77206: Calling all_plugins_play to load vars for managed_node2 29946 1726882590.77209: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882590.77213: Calling groups_plugins_play to load vars for managed_node2 29946 1726882590.77823: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000005c 29946 1726882590.77826: WORKER PROCESS EXITING 29946 1726882590.79749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882590.81537: done with get_vars() 29946 1726882590.81553: done getting variables 29946 1726882590.81598: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30400] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:122 Friday 20 September 2024 21:36:30 -0400 (0:00:00.373) 0:00:16.925 ****** 29946 1726882590.81619: entering _queue_task() for managed_node2/command 29946 1726882590.81835: worker is 1 (out of 1 available) 29946 1726882590.81848: exiting _queue_task() for managed_node2/command 29946 1726882590.81858: done queuing things up, now waiting for results queue to drain 29946 1726882590.81860: waiting for pending results... 
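Both routing-rule checks in this play follow the same shape: the command action runs ip rule list table <N> on managed_node2 and the test inspects the rules through the registered stdout. The raw module result above reported "changed": true (the command module's default) while the callback printed "changed": false, and the log shows "Evaluated conditional (False): False", which points to a changed_when override on the task. A minimal reconstruction of the table-30200 task under those assumptions (the register name, changed_when, and the follow-up assert are illustrative, not the playbook's actual contents):

# illustrative sketch of a tests_routing_rules.yml-style check; names are assumed
- name: Get the routing rule for looking up the table 30200
  ansible.builtin.command: ip rule list table 30200
  register: route_rule_table_30200     # assumed variable name
  changed_when: false                  # keeps a read-only check from reporting "changed"

- name: Assert the expected rule is present    # hypothetical verification step
  ansible.builtin.assert:
    that:
      - route_rule_table_30200.stdout is search("from 198.51.100.58/26 lookup 30200")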
29946 1726882590.82034: running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 30400 29946 1726882590.82095: in run() - task 12673a56-9f93-95e7-9dfb-00000000005d 29946 1726882590.82109: variable 'ansible_search_path' from source: unknown 29946 1726882590.82137: calling self._execute() 29946 1726882590.82215: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882590.82219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882590.82224: variable 'omit' from source: magic vars 29946 1726882590.82506: variable 'ansible_distribution_major_version' from source: facts 29946 1726882590.82516: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882590.82726: variable 'ansible_distribution_major_version' from source: facts 29946 1726882590.82730: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882590.82732: variable 'omit' from source: magic vars 29946 1726882590.82735: variable 'omit' from source: magic vars 29946 1726882590.82737: variable 'omit' from source: magic vars 29946 1726882590.82998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882590.83002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882590.83005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882590.83007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882590.83009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882590.83011: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882590.83014: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882590.83016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882590.83018: Set connection var ansible_pipelining to False 29946 1726882590.83020: Set connection var ansible_shell_executable to /bin/sh 29946 1726882590.83022: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882590.83024: Set connection var ansible_timeout to 10 29946 1726882590.83026: Set connection var ansible_shell_type to sh 29946 1726882590.83028: Set connection var ansible_connection to ssh 29946 1726882590.83050: variable 'ansible_shell_executable' from source: unknown 29946 1726882590.83053: variable 'ansible_connection' from source: unknown 29946 1726882590.83056: variable 'ansible_module_compression' from source: unknown 29946 1726882590.83059: variable 'ansible_shell_type' from source: unknown 29946 1726882590.83061: variable 'ansible_shell_executable' from source: unknown 29946 1726882590.83063: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882590.83065: variable 'ansible_pipelining' from source: unknown 29946 1726882590.83066: variable 'ansible_timeout' from source: unknown 29946 1726882590.83069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882590.83185: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882590.83200: variable 'omit' from source: magic vars 29946 1726882590.83205: starting attempt loop 29946 1726882590.83208: running the handler 29946 1726882590.83223: _low_level_execute_command(): starting 29946 1726882590.83232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882590.83857: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.83874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.83886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.83931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.83944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.84013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.85586: stdout chunk (state=3): >>>/root <<< 29946 1726882590.85684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.85717: stderr chunk (state=3): >>><<< 29946 1726882590.85719: stdout chunk (state=3): >>><<< 29946 1726882590.85734: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.85750: _low_level_execute_command(): starting 29946 1726882590.85754: 
_low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949 `" && echo ansible-tmp-1726882590.8573916-30741-79179890353949="` echo /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949 `" ) && sleep 0' 29946 1726882590.86162: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882590.86166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882590.86170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882590.86178: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882590.86180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.86224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.86230: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.86297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.88159: stdout chunk (state=3): >>>ansible-tmp-1726882590.8573916-30741-79179890353949=/root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949 <<< 29946 1726882590.88266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.88298: stderr chunk (state=3): >>><<< 29946 1726882590.88301: stdout chunk (state=3): >>><<< 29946 1726882590.88309: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882590.8573916-30741-79179890353949=/root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.88334: variable 'ansible_module_compression' from source: unknown 29946 1726882590.88369: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882590.88401: variable 'ansible_facts' from source: unknown 29946 1726882590.88455: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py 29946 1726882590.88550: Sending initial data 29946 1726882590.88553: Sent initial data (155 bytes) 29946 1726882590.88947: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.88951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.88965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.89019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.89022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.89091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.90590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 29946 1726882590.90599: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882590.90650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882590.90717: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpvh_62otl /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py <<< 29946 1726882590.90719: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py" <<< 29946 1726882590.90773: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpvh_62otl" to remote "/root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py" <<< 29946 1726882590.90777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py" <<< 29946 1726882590.91372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.91404: stderr chunk (state=3): >>><<< 29946 1726882590.91407: stdout chunk (state=3): >>><<< 29946 1726882590.91422: done transferring module to remote 29946 1726882590.91430: _low_level_execute_command(): starting 29946 1726882590.91433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/ /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py && sleep 0' 29946 1726882590.91838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.91841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.91843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882590.91845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882590.91850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.91885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882590.91888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.91961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882590.93673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882590.93688: stderr chunk (state=3): >>><<< 29946 1726882590.93695: stdout chunk (state=3): >>><<< 29946 1726882590.93709: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882590.93712: _low_level_execute_command(): starting 29946 1726882590.93715: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/AnsiballZ_command.py && sleep 0' 29946 1726882590.94126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882590.94129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882590.94132: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.94134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882590.94135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882590.94137: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882590.94182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882590.94190: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882590.94254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.09510: stdout chunk (state=3): >>> {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-20 21:36:31.090646", "end": "2024-09-20 21:36:31.094172", "delta": "0:00:00.003526", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": 
null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882591.11061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882591.11065: stdout chunk (state=3): >>><<< 29946 1726882591.11067: stderr chunk (state=3): >>><<< 29946 1726882591.11086: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30400:\tfrom all to 198.51.100.128/26 lookup 30400 proto static\n30401:\tfrom all iif iiftest [detached] lookup 30400 proto static\n30402:\tfrom all oif oiftest [detached] lookup 30400 proto static\n30403:\tfrom all lookup 30400 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "30400"], "start": "2024-09-20 21:36:31.090646", "end": "2024-09-20 21:36:31.094172", "delta": "0:00:00.003526", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table 30400", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882591.11132: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table 30400', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882591.11166: _low_level_execute_command(): starting 29946 1726882591.11169: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882590.8573916-30741-79179890353949/ > /dev/null 2>&1 && sleep 0' 29946 1726882591.11786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882591.11803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.11823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.11844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882591.11946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.11975: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.12071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.13960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.13969: stdout chunk (state=3): >>><<< 29946 1726882591.13978: stderr chunk (state=3): >>><<< 29946 1726882591.14404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.14408: handler run complete 29946 1726882591.14410: Evaluated conditional (False): False 29946 1726882591.14416: attempt loop complete, returning result 29946 1726882591.14419: _execute() done 29946 1726882591.14421: dumping result to json 29946 1726882591.14423: done dumping result, returning 29946 1726882591.14425: done running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 30400 [12673a56-9f93-95e7-9dfb-00000000005d] 29946 1726882591.14427: sending task result for task 12673a56-9f93-95e7-9dfb-00000000005d 29946 1726882591.14503: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000005d 29946 1726882591.14507: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "30400" ], "delta": "0:00:00.003526", "end": "2024-09-20 21:36:31.094172", "rc": 0, "start": "2024-09-20 21:36:31.090646" } STDOUT: 30400: from all to 198.51.100.128/26 lookup 30400 proto static 30401: from all iif iiftest [detached] lookup 30400 proto static 30402: from all oif oiftest [detached] lookup 30400 proto static 30403: from all lookup 30400 proto static 29946 1726882591.14828: no more pending results, returning what we have 29946 1726882591.14832: results queue empty 29946 1726882591.14833: checking for any_errors_fatal 29946 1726882591.14839: done checking for any_errors_fatal 29946 1726882591.14840: checking for max_fail_percentage 29946 1726882591.14842: done checking for max_fail_percentage 29946 1726882591.14843: checking to see if all hosts have failed and the running result is not ok 29946 1726882591.14844: done checking to see if all hosts have failed 29946 1726882591.14844: getting the remaining hosts for this loop 29946 1726882591.14845: done getting the remaining hosts for this loop 29946 1726882591.14848: getting the next task for host managed_node2 29946 1726882591.14853: done getting next task for host managed_node2 29946 1726882591.14856: ^ task is: TASK: Get the routing rule for looking up the table 30600 29946 1726882591.14858: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882591.14861: getting variables 29946 1726882591.14863: in VariableManager get_vars() 29946 1726882591.14897: Calling all_inventory to load vars for managed_node2 29946 1726882591.14900: Calling groups_inventory to load vars for managed_node2 29946 1726882591.14902: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882591.14912: Calling all_plugins_play to load vars for managed_node2 29946 1726882591.14914: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882591.14917: Calling groups_plugins_play to load vars for managed_node2 29946 1726882591.16205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882591.17736: done with get_vars() 29946 1726882591.17757: done getting variables 29946 1726882591.17815: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 30600] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:129 Friday 20 September 2024 21:36:31 -0400 (0:00:00.362) 0:00:17.287 ****** 29946 1726882591.17842: entering _queue_task() for managed_node2/command 29946 1726882591.18133: worker is 1 (out of 1 available) 29946 1726882591.18144: exiting _queue_task() for managed_node2/command 29946 1726882591.18157: done queuing things up, now waiting for results queue to drain 29946 1726882591.18158: waiting for pending results... 29946 1726882591.18612: running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 30600 29946 1726882591.18618: in run() - task 12673a56-9f93-95e7-9dfb-00000000005e 29946 1726882591.18621: variable 'ansible_search_path' from source: unknown 29946 1726882591.18623: calling self._execute() 29946 1726882591.18656: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882591.18668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882591.18681: variable 'omit' from source: magic vars 29946 1726882591.19232: variable 'ansible_distribution_major_version' from source: facts 29946 1726882591.19296: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882591.19522: variable 'ansible_distribution_major_version' from source: facts 29946 1726882591.19715: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882591.19718: variable 'omit' from source: magic vars 29946 1726882591.19720: variable 'omit' from source: magic vars 29946 1726882591.19722: variable 'omit' from source: magic vars 29946 1726882591.19724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882591.19872: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882591.19897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882591.19950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882591.19991: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882591.20399: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882591.20403: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882591.20405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882591.20508: Set connection var ansible_pipelining to False 29946 1726882591.20511: Set connection var ansible_shell_executable to /bin/sh 29946 1726882591.20513: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882591.20515: Set connection var ansible_timeout to 10 29946 1726882591.20517: Set connection var ansible_shell_type to sh 29946 1726882591.20519: Set connection var ansible_connection to ssh 29946 1726882591.20520: variable 'ansible_shell_executable' from source: unknown 29946 1726882591.20523: variable 'ansible_connection' from source: unknown 29946 1726882591.20524: variable 'ansible_module_compression' from source: unknown 29946 1726882591.20526: variable 'ansible_shell_type' from source: unknown 29946 1726882591.20528: variable 'ansible_shell_executable' from source: unknown 29946 1726882591.20530: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882591.20531: variable 'ansible_pipelining' from source: unknown 29946 1726882591.20533: variable 'ansible_timeout' from source: unknown 29946 1726882591.20535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882591.20759: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882591.20776: variable 'omit' from source: magic vars 29946 1726882591.20810: starting attempt loop 29946 1726882591.20838: running the handler 29946 1726882591.20857: _low_level_execute_command(): starting 29946 1726882591.20954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882591.22256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882591.22271: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.22288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.22362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.22410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.22438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 29946 1726882591.22460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.22578: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.24132: stdout chunk (state=3): >>>/root <<< 29946 1726882591.24267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.24277: stdout chunk (state=3): >>><<< 29946 1726882591.24290: stderr chunk (state=3): >>><<< 29946 1726882591.24320: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.24340: _low_level_execute_command(): starting 29946 1726882591.24482: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308 `" && echo ansible-tmp-1726882591.2432673-30756-270581647958308="` echo /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308 `" ) && sleep 0' 29946 1726882591.25745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.25791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.25846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.27711: stdout chunk (state=3): 
>>>ansible-tmp-1726882591.2432673-30756-270581647958308=/root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308 <<< 29946 1726882591.27852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.27863: stdout chunk (state=3): >>><<< 29946 1726882591.27875: stderr chunk (state=3): >>><<< 29946 1726882591.27900: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882591.2432673-30756-270581647958308=/root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.28199: variable 'ansible_module_compression' from source: unknown 29946 1726882591.28202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882591.28204: variable 'ansible_facts' from source: unknown 29946 1726882591.28400: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py 29946 1726882591.28614: Sending initial data 29946 1726882591.28623: Sent initial data (156 bytes) 29946 1726882591.30301: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.30318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.30357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.30416: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.31938: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29946 1726882591.31952: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 29946 1726882591.31968: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882591.32323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882591.32386: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpn4m_xlka /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py <<< 29946 1726882591.32392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py" <<< 29946 1726882591.32450: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpn4m_xlka" to remote "/root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py" <<< 29946 1726882591.34298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.34399: stderr chunk (state=3): >>><<< 29946 1726882591.34402: stdout chunk (state=3): >>><<< 29946 1726882591.34405: done transferring module to remote 29946 1726882591.34407: _low_level_execute_command(): starting 29946 1726882591.34409: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/ /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py && sleep 0' 29946 1726882591.35709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.35826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.35862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.35920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.37681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.37691: stdout chunk (state=3): >>><<< 29946 1726882591.37706: stderr chunk (state=3): >>><<< 29946 1726882591.37725: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.37738: _low_level_execute_command(): starting 29946 1726882591.37748: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/AnsiballZ_command.py && sleep 0' 29946 1726882591.38316: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882591.38330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.38342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.38359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882591.38378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882591.38389: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882591.38405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.38424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882591.38435: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882591.38445: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29946 1726882591.38455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.38501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.38563: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.38608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.38631: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.38830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.54112: stdout chunk (state=3): >>> {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-20 21:36:31.536709", "end": "2024-09-20 21:36:31.540142", "delta": "0:00:00.003433", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882591.55810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882591.55814: stdout chunk (state=3): >>><<< 29946 1726882591.55817: stderr chunk (state=3): >>><<< 29946 1726882591.55819: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "30600:\tfrom all to 2001:db8::4/32 lookup 30600 proto static\n30601:\tnot from all dport 128-256 lookup 30600 proto static\n30602:\tfrom all lookup 30600 proto static", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "rule", "list", "table", "30600"], "start": "2024-09-20 21:36:31.536709", "end": "2024-09-20 21:36:31.540142", "delta": "0:00:00.003433", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 rule list table 30600", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
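The IPv6 counterpart above ran 'ip -6 rule list table 30600' and returned three rules, including a negated dport rule. One way such output is typically verified afterwards is sketched below; the command and the expected strings come from the log, but the registered variable name, the changed_when setting, and the assert step itself are assumptions, since this excerpt does not show how tests_routing_rules.yml actually checks the result:

    - name: Get the routing rule for looking up the table 30600
      ansible.builtin.command: ip -6 rule list table 30600
      register: route_rule_table_30600   # assumed name, not shown in the log
      changed_when: false                # assumed; the reported task result prints "changed": false

    - name: Verify the rules in table 30600 (illustrative assumption, not the playbook's actual check)
      ansible.builtin.assert:
        that:
          - route_rule_table_30600.stdout is search("from all to 2001:db8::4/32 lookup 30600 proto static")
          - route_rule_table_30600.stdout is search("not from all dport 128-256 lookup 30600 proto static")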
29946 1726882591.55823: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 rule list table 30600', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882591.55825: _low_level_execute_command(): starting 29946 1726882591.55827: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882591.2432673-30756-270581647958308/ > /dev/null 2>&1 && sleep 0' 29946 1726882591.56637: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.56641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882591.56643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.56645: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.56647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.56804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.56867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.58633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.58664: stderr chunk (state=3): >>><<< 29946 1726882591.58673: stdout chunk (state=3): >>><<< 29946 1726882591.58696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.58713: handler run complete 29946 1726882591.58743: Evaluated conditional (False): False 29946 1726882591.58760: attempt loop complete, returning result 29946 1726882591.58768: _execute() done 29946 1726882591.58776: dumping result to json 29946 1726882591.58786: done dumping result, returning 29946 1726882591.58898: done running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 30600 [12673a56-9f93-95e7-9dfb-00000000005e] 29946 1726882591.58902: sending task result for task 12673a56-9f93-95e7-9dfb-00000000005e 29946 1726882591.58980: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000005e 29946 1726882591.58982: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "-6", "rule", "list", "table", "30600" ], "delta": "0:00:00.003433", "end": "2024-09-20 21:36:31.540142", "rc": 0, "start": "2024-09-20 21:36:31.536709" } STDOUT: 30600: from all to 2001:db8::4/32 lookup 30600 proto static 30601: not from all dport 128-256 lookup 30600 proto static 30602: from all lookup 30600 proto static 29946 1726882591.59064: no more pending results, returning what we have 29946 1726882591.59068: results queue empty 29946 1726882591.59069: checking for any_errors_fatal 29946 1726882591.59077: done checking for any_errors_fatal 29946 1726882591.59078: checking for max_fail_percentage 29946 1726882591.59081: done checking for max_fail_percentage 29946 1726882591.59082: checking to see if all hosts have failed and the running result is not ok 29946 1726882591.59083: done checking to see if all hosts have failed 29946 1726882591.59083: getting the remaining hosts for this loop 29946 1726882591.59085: done getting the remaining hosts for this loop 29946 1726882591.59089: getting the next task for host managed_node2 29946 1726882591.59097: done getting next task for host managed_node2 29946 1726882591.59100: ^ task is: TASK: Get the routing rule for looking up the table 'custom' 29946 1726882591.59103: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882591.59108: getting variables 29946 1726882591.59110: in VariableManager get_vars() 29946 1726882591.59151: Calling all_inventory to load vars for managed_node2 29946 1726882591.59154: Calling groups_inventory to load vars for managed_node2 29946 1726882591.59156: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882591.59169: Calling all_plugins_play to load vars for managed_node2 29946 1726882591.59172: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882591.59175: Calling groups_plugins_play to load vars for managed_node2 29946 1726882591.61578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882591.66478: done with get_vars() 29946 1726882591.66503: done getting variables 29946 1726882591.66551: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get the routing rule for looking up the table 'custom'] ****************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:136 Friday 20 September 2024 21:36:31 -0400 (0:00:00.487) 0:00:17.775 ****** 29946 1726882591.66574: entering _queue_task() for managed_node2/command 29946 1726882591.66875: worker is 1 (out of 1 available) 29946 1726882591.66889: exiting _queue_task() for managed_node2/command 29946 1726882591.66917: done queuing things up, now waiting for results queue to drain 29946 1726882591.66920: waiting for pending results... 29946 1726882591.67092: running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 'custom' 29946 1726882591.67163: in run() - task 12673a56-9f93-95e7-9dfb-00000000005f 29946 1726882591.67176: variable 'ansible_search_path' from source: unknown 29946 1726882591.67214: calling self._execute() 29946 1726882591.67295: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882591.67303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882591.67312: variable 'omit' from source: magic vars 29946 1726882591.67601: variable 'ansible_distribution_major_version' from source: facts 29946 1726882591.67612: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882591.67691: variable 'ansible_distribution_major_version' from source: facts 29946 1726882591.67696: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882591.67699: variable 'omit' from source: magic vars 29946 1726882591.67726: variable 'omit' from source: magic vars 29946 1726882591.67752: variable 'omit' from source: magic vars 29946 1726882591.67785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882591.67818: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882591.67832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882591.67845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882591.67854: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882591.67878: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882591.67881: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882591.67884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882591.67955: Set connection var ansible_pipelining to False 29946 1726882591.67959: Set connection var ansible_shell_executable to /bin/sh 29946 1726882591.67964: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882591.67969: Set connection var ansible_timeout to 10 29946 1726882591.67975: Set connection var ansible_shell_type to sh 29946 1726882591.67977: Set connection var ansible_connection to ssh 29946 1726882591.67997: variable 'ansible_shell_executable' from source: unknown 29946 1726882591.68001: variable 'ansible_connection' from source: unknown 29946 1726882591.68003: variable 'ansible_module_compression' from source: unknown 29946 1726882591.68006: variable 'ansible_shell_type' from source: unknown 29946 1726882591.68008: variable 'ansible_shell_executable' from source: unknown 29946 1726882591.68010: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882591.68013: variable 'ansible_pipelining' from source: unknown 29946 1726882591.68015: variable 'ansible_timeout' from source: unknown 29946 1726882591.68020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882591.68117: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882591.68127: variable 'omit' from source: magic vars 29946 1726882591.68131: starting attempt loop 29946 1726882591.68136: running the handler 29946 1726882591.68149: _low_level_execute_command(): starting 29946 1726882591.68156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882591.68901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.68916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.68921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.68998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
29946 1726882591.70713: stdout chunk (state=3): >>>/root <<< 29946 1726882591.70811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.70842: stderr chunk (state=3): >>><<< 29946 1726882591.70845: stdout chunk (state=3): >>><<< 29946 1726882591.70863: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.70874: _low_level_execute_command(): starting 29946 1726882591.70878: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439 `" && echo ansible-tmp-1726882591.708631-30782-205532069678439="` echo /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439 `" ) && sleep 0' 29946 1726882591.71274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.71313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882591.71316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.71319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.71330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.71368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.71372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.71440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.73359: stdout 
chunk (state=3): >>>ansible-tmp-1726882591.708631-30782-205532069678439=/root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439 <<< 29946 1726882591.73474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.73495: stderr chunk (state=3): >>><<< 29946 1726882591.73499: stdout chunk (state=3): >>><<< 29946 1726882591.73513: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882591.708631-30782-205532069678439=/root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.73537: variable 'ansible_module_compression' from source: unknown 29946 1726882591.73577: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882591.73613: variable 'ansible_facts' from source: unknown 29946 1726882591.73661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py 29946 1726882591.73756: Sending initial data 29946 1726882591.73760: Sent initial data (155 bytes) 29946 1726882591.74165: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882591.74169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.74171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882591.74174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882591.74176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.74227: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.74234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.74305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.75897: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882591.75905: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882591.75956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882591.76019: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpgfl04_qo /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py <<< 29946 1726882591.76023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py" <<< 29946 1726882591.76075: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpgfl04_qo" to remote "/root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py" <<< 29946 1726882591.76667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.76703: stderr chunk (state=3): >>><<< 29946 1726882591.76706: stdout chunk (state=3): >>><<< 29946 1726882591.76738: done transferring module to remote 29946 1726882591.76747: _low_level_execute_command(): starting 29946 1726882591.76751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/ /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py && sleep 0' 29946 1726882591.77144: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.77149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.77152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882591.77155: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882591.77157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.77215: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.77218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.77270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.79037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.79053: stderr chunk (state=3): >>><<< 29946 1726882591.79057: stdout chunk (state=3): >>><<< 29946 1726882591.79068: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.79071: _low_level_execute_command(): starting 29946 1726882591.79074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/AnsiballZ_command.py && sleep 0' 29946 1726882591.79468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882591.79471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882591.79474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882591.79476: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882591.79478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.79535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.79537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.79601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.95381: stdout chunk (state=3): >>> {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-20 21:36:31.949194", "end": "2024-09-20 21:36:31.952921", "delta": "0:00:00.003727", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882591.97114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882591.97118: stdout chunk (state=3): >>><<< 29946 1726882591.97120: stderr chunk (state=3): >>><<< 29946 1726882591.97123: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "200:\tfrom 198.51.100.56/26 lookup custom proto static", "stderr": "", "rc": 0, "cmd": ["ip", "rule", "list", "table", "custom"], "start": "2024-09-20 21:36:31.949194", "end": "2024-09-20 21:36:31.952921", "delta": "0:00:00.003727", "msg": "", "invocation": {"module_args": {"_raw_params": "ip rule list table custom", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882591.97125: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip rule list table custom', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882591.97128: _low_level_execute_command(): starting 29946 1726882591.97130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882591.708631-30782-205532069678439/ > /dev/null 2>&1 && sleep 0' 29946 1726882591.97739: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882591.97808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882591.97852: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882591.97864: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882591.97907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882591.97963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882591.99857: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882591.99861: stdout chunk (state=3): >>><<< 29946 1726882591.99867: stderr chunk (state=3): >>><<< 29946 1726882591.99878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882591.99885: handler run complete 29946 1726882591.99910: Evaluated conditional (False): False 29946 1726882591.99918: attempt loop complete, returning result 29946 1726882591.99921: _execute() done 29946 1726882591.99923: dumping result to json 29946 1726882591.99929: done dumping result, returning 29946 1726882591.99936: done running TaskExecutor() for managed_node2/TASK: Get the routing rule for looking up the table 'custom' [12673a56-9f93-95e7-9dfb-00000000005f] 29946 1726882591.99940: sending task result for task 12673a56-9f93-95e7-9dfb-00000000005f 29946 1726882592.00045: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000005f 29946 1726882592.00048: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "rule", "list", "table", "custom" ], "delta": "0:00:00.003727", "end": "2024-09-20 21:36:31.952921", "rc": 0, "start": "2024-09-20 21:36:31.949194" } STDOUT: 200: from 198.51.100.56/26 lookup custom proto static 29946 1726882592.00121: no more pending results, returning what we have 29946 1726882592.00124: results queue empty 29946 1726882592.00125: checking for any_errors_fatal 29946 1726882592.00135: done checking for any_errors_fatal 29946 1726882592.00135: checking for max_fail_percentage 29946 1726882592.00138: done checking for max_fail_percentage 29946 1726882592.00139: checking to see if all hosts have failed and the running result is not ok 29946 1726882592.00140: done checking to see if all hosts have failed 29946 1726882592.00140: getting the remaining hosts for this loop 29946 1726882592.00142: done getting the remaining hosts for this loop 29946 1726882592.00145: getting the next task for host managed_node2 29946 1726882592.00151: done getting next task for host managed_node2 29946 1726882592.00153: ^ task is: TASK: Get the IPv4 routing rule for the connection "{{ interface }}" 29946 1726882592.00155: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882592.00166: getting variables 29946 1726882592.00168: in VariableManager get_vars() 29946 1726882592.00209: Calling all_inventory to load vars for managed_node2 29946 1726882592.00211: Calling groups_inventory to load vars for managed_node2 29946 1726882592.00213: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882592.00223: Calling all_plugins_play to load vars for managed_node2 29946 1726882592.00225: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882592.00227: Calling groups_plugins_play to load vars for managed_node2 29946 1726882592.01049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882592.02360: done with get_vars() 29946 1726882592.02377: done getting variables 29946 1726882592.02422: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882592.02515: variable 'interface' from source: set_fact TASK [Get the IPv4 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:143 Friday 20 September 2024 21:36:32 -0400 (0:00:00.359) 0:00:18.134 ****** 29946 1726882592.02537: entering _queue_task() for managed_node2/command 29946 1726882592.02759: worker is 1 (out of 1 available) 29946 1726882592.02772: exiting _queue_task() for managed_node2/command 29946 1726882592.02784: done queuing things up, now waiting for results queue to drain 29946 1726882592.02786: waiting for pending results... 
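For context, the task that just returned ok ("Get the routing rule for looking up the table 'custom'") is a plain command-module call: the module_args recorded above show _raw_params of 'ip rule list table custom', executed through the usual low-level sequence (mkdir remote tmp, sftp the AnsiballZ_command.py payload, chmod, run it with python3.12, remove the tmp dir). A minimal sketch of such a task, reconstructed from that logged invocation rather than taken from tests_routing_rules.yml itself (the register name and changed_when are assumptions for illustration):

    - name: Get the routing rule for looking up the table 'custom'
      ansible.builtin.command: ip rule list table custom
      register: route_rule_table_custom   # hypothetical variable name, not from the log
      changed_when: false                 # assumed; the final result reports changed: false

The recorded stdout, "200: from 198.51.100.56/26 lookup custom proto static", confirms that a rule with priority 200 pointing at table 'custom' exists for the expected source prefix and was installed as a static-protocol rule.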
29946 1726882592.02960: running TaskExecutor() for managed_node2/TASK: Get the IPv4 routing rule for the connection "ethtest0" 29946 1726882592.03025: in run() - task 12673a56-9f93-95e7-9dfb-000000000060 29946 1726882592.03038: variable 'ansible_search_path' from source: unknown 29946 1726882592.03072: calling self._execute() 29946 1726882592.03149: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.03156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.03163: variable 'omit' from source: magic vars 29946 1726882592.03454: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.03467: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882592.03470: variable 'omit' from source: magic vars 29946 1726882592.03490: variable 'omit' from source: magic vars 29946 1726882592.03562: variable 'interface' from source: set_fact 29946 1726882592.03577: variable 'omit' from source: magic vars 29946 1726882592.03612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882592.03639: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882592.03654: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882592.03669: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.03681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.03715: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882592.03719: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.03721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.03825: Set connection var ansible_pipelining to False 29946 1726882592.03828: Set connection var ansible_shell_executable to /bin/sh 29946 1726882592.03899: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882592.03901: Set connection var ansible_timeout to 10 29946 1726882592.03904: Set connection var ansible_shell_type to sh 29946 1726882592.03906: Set connection var ansible_connection to ssh 29946 1726882592.03920: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.03932: variable 'ansible_connection' from source: unknown 29946 1726882592.03941: variable 'ansible_module_compression' from source: unknown 29946 1726882592.04039: variable 'ansible_shell_type' from source: unknown 29946 1726882592.04042: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.04044: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.04046: variable 'ansible_pipelining' from source: unknown 29946 1726882592.04048: variable 'ansible_timeout' from source: unknown 29946 1726882592.04050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.04129: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882592.04147: variable 'omit' from source: 
magic vars 29946 1726882592.04164: starting attempt loop 29946 1726882592.04176: running the handler 29946 1726882592.04201: _low_level_execute_command(): starting 29946 1726882592.04214: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882592.05014: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.05060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.05068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.05132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.06827: stdout chunk (state=3): >>>/root <<< 29946 1726882592.06979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.06982: stdout chunk (state=3): >>><<< 29946 1726882592.06984: stderr chunk (state=3): >>><<< 29946 1726882592.07017: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.07114: _low_level_execute_command(): starting 29946 1726882592.07118: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125 `" && echo ansible-tmp-1726882592.0702405-30804-134212737591125="` echo /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125 `" ) 
&& sleep 0' 29946 1726882592.07742: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.07788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.09729: stdout chunk (state=3): >>>ansible-tmp-1726882592.0702405-30804-134212737591125=/root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125 <<< 29946 1726882592.09876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.09879: stdout chunk (state=3): >>><<< 29946 1726882592.09882: stderr chunk (state=3): >>><<< 29946 1726882592.09982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882592.0702405-30804-134212737591125=/root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.09986: variable 'ansible_module_compression' from source: unknown 29946 1726882592.09988: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882592.10199: variable 'ansible_facts' from source: unknown 29946 1726882592.10202: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py 29946 1726882592.10325: Sending initial data 29946 1726882592.10329: Sent initial data (156 
bytes) 29946 1726882592.10828: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882592.10837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882592.10849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.10888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882592.10891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882592.10898: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882592.10900: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.10938: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.10953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.11018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.12585: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 29946 1726882592.12590: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882592.12661: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882592.12721: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp4ej6yl0p /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py <<< 29946 1726882592.12727: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py" <<< 29946 1726882592.12781: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp4ej6yl0p" to remote "/root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py" <<< 29946 1726882592.13389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.13420: stderr chunk (state=3): >>><<< 29946 1726882592.13424: stdout chunk (state=3): >>><<< 29946 1726882592.13439: done transferring module to remote 29946 1726882592.13451: _low_level_execute_command(): starting 29946 1726882592.13454: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/ /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py && sleep 0' 29946 1726882592.13853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.13856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882592.13859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882592.13861: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.13863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.13914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.13917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.13982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.15784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.15797: stdout chunk (state=3): >>><<< 29946 1726882592.15800: stderr chunk (state=3): >>><<< 29946 1726882592.15808: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.15811: _low_level_execute_command(): starting 29946 1726882592.15819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/AnsiballZ_command.py && sleep 0' 29946 1726882592.16230: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882592.16233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882592.16236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.16238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.16240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.16288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.16291: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.16367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.33180: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 21:36:32.313709", "end": "2024-09-20 21:36:32.330829", "delta": "0:00:00.017120", "msg": "", 
"invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882592.35000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882592.35005: stdout chunk (state=3): >>><<< 29946 1726882592.35008: stderr chunk (state=3): >>><<< 29946 1726882592.35011: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 21:36:32.313709", "end": "2024-09-20 21:36:32.330829", "delta": "0:00:00.017120", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv4.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882592.35014: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv4.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882592.35018: _low_level_execute_command(): starting 29946 1726882592.35020: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882592.0702405-30804-134212737591125/ > /dev/null 2>&1 && sleep 0' 29946 1726882592.35783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882592.35796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882592.35847: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.35892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.35900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.35939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.36003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.37942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.37946: stdout chunk (state=3): >>><<< 29946 1726882592.37948: stderr chunk (state=3): >>><<< 29946 1726882592.37965: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.38125: handler run complete 29946 1726882592.38128: Evaluated conditional (False): False 29946 1726882592.38131: attempt loop complete, returning result 29946 1726882592.38133: _execute() done 29946 1726882592.38136: dumping result to json 29946 1726882592.38137: done dumping result, returning 29946 1726882592.38140: done running TaskExecutor() for managed_node2/TASK: Get the IPv4 routing rule for the connection "ethtest0" [12673a56-9f93-95e7-9dfb-000000000060] 29946 1726882592.38142: sending task result for task 12673a56-9f93-95e7-9dfb-000000000060 ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv4.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.017120", "end": "2024-09-20 21:36:32.330829", "rc": 0, "start": "2024-09-20 21:36:32.313709" } STDOUT: ipv4.routing-rules: priority 30200 from 198.51.100.58/26 table 30200, priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200, priority 30202 from 0.0.0.0/0 ipproto 6 table 30200, priority 30203 from 0.0.0.0/0 sport 128-256 table 30200, priority 30204 from 0.0.0.0/0 tos 0x08 table 30200, priority 30400 to 198.51.100.128/26 table 30400, priority 30401 from 0.0.0.0/0 iif iiftest table 30400, priority 30402 from 0.0.0.0/0 oif oiftest table 30400, priority 30403 from 0.0.0.0/0 table 30400, priority 200 from 198.51.100.56/26 table 200 29946 1726882592.38298: no more pending results, returning what we have 29946 1726882592.38302: results queue empty 29946 1726882592.38303: checking for any_errors_fatal 29946 1726882592.38310: done checking for any_errors_fatal 29946 1726882592.38311: checking for max_fail_percentage 29946 1726882592.38314: done checking for max_fail_percentage 29946 1726882592.38314: checking to see if all hosts have failed and the running result is not ok 29946 1726882592.38315: done checking to see if all hosts have failed 29946 1726882592.38316: getting the remaining hosts for this loop 29946 1726882592.38317: done getting the remaining hosts for this loop 29946 1726882592.38321: getting the next task for host managed_node2 29946 1726882592.38327: done getting next task for host managed_node2 29946 1726882592.38329: ^ task is: TASK: Get the IPv6 routing rule for the connection "{{ interface }}" 29946 1726882592.38331: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882592.38335: getting variables 29946 1726882592.38336: in VariableManager get_vars() 29946 1726882592.38372: Calling all_inventory to load vars for managed_node2 29946 1726882592.38374: Calling groups_inventory to load vars for managed_node2 29946 1726882592.38376: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882592.38392: Calling all_plugins_play to load vars for managed_node2 29946 1726882592.38499: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882592.38505: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000060 29946 1726882592.38509: WORKER PROCESS EXITING 29946 1726882592.38514: Calling groups_plugins_play to load vars for managed_node2 29946 1726882592.40368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882592.42042: done with get_vars() 29946 1726882592.42063: done getting variables 29946 1726882592.42128: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882592.42252: variable 'interface' from source: set_fact TASK [Get the IPv6 routing rule for the connection "ethtest0"] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:149 Friday 20 September 2024 21:36:32 -0400 (0:00:00.397) 0:00:18.532 ****** 29946 1726882592.42280: entering _queue_task() for managed_node2/command 29946 1726882592.42633: worker is 1 (out of 1 available) 29946 1726882592.42759: exiting _queue_task() for managed_node2/command 29946 1726882592.42769: done queuing things up, now waiting for results queue to drain 29946 1726882592.42770: waiting for pending results... 
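Similarly, the IPv4 routing-rule check that completed above queries NetworkManager directly: its module_args show _raw_params of nmcli -f ipv4.routing-rules c show "ethtest0", with the connection name substituted from the 'interface' variable set earlier via set_fact. A minimal sketch of that task, again reconstructed from the logged invocation rather than copied from the playbook source (register name and changed_when are assumptions):

    - name: Get the IPv4 routing rule for the connection "{{ interface }}"
      ansible.builtin.command: nmcli -f ipv4.routing-rules c show "{{ interface }}"
      register: connection_route_rule     # hypothetical variable name, not from the log
      changed_when: false                 # assumed; the final result reports changed: false

The logged stdout lists the rules at priorities 30200-30204, 30400-30403 and 200 with their from/to selectors, fwmark, ipproto, sport, tos, iif/oif and table assignments, which is presumably what later assertions in the test compare against. The next task queued, "Get the IPv6 routing rule for the connection ethtest0" (tests_routing_rules.yml:149), follows the same pattern but its command is not shown in this portion of the log.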
29946 1726882592.43112: running TaskExecutor() for managed_node2/TASK: Get the IPv6 routing rule for the connection "ethtest0" 29946 1726882592.43117: in run() - task 12673a56-9f93-95e7-9dfb-000000000061 29946 1726882592.43120: variable 'ansible_search_path' from source: unknown 29946 1726882592.43123: calling self._execute() 29946 1726882592.43220: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.43228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.43236: variable 'omit' from source: magic vars 29946 1726882592.43646: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.43658: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882592.43664: variable 'omit' from source: magic vars 29946 1726882592.43685: variable 'omit' from source: magic vars 29946 1726882592.43801: variable 'interface' from source: set_fact 29946 1726882592.43819: variable 'omit' from source: magic vars 29946 1726882592.43871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882592.43999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882592.44002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882592.44004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.44007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.44009: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882592.44012: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.44014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.44133: Set connection var ansible_pipelining to False 29946 1726882592.44136: Set connection var ansible_shell_executable to /bin/sh 29946 1726882592.44143: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882592.44148: Set connection var ansible_timeout to 10 29946 1726882592.44155: Set connection var ansible_shell_type to sh 29946 1726882592.44158: Set connection var ansible_connection to ssh 29946 1726882592.44188: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.44194: variable 'ansible_connection' from source: unknown 29946 1726882592.44198: variable 'ansible_module_compression' from source: unknown 29946 1726882592.44200: variable 'ansible_shell_type' from source: unknown 29946 1726882592.44202: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.44204: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.44209: variable 'ansible_pipelining' from source: unknown 29946 1726882592.44212: variable 'ansible_timeout' from source: unknown 29946 1726882592.44216: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.44368: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882592.44377: variable 'omit' from source: 
magic vars 29946 1726882592.44392: starting attempt loop 29946 1726882592.44396: running the handler 29946 1726882592.44413: _low_level_execute_command(): starting 29946 1726882592.44421: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882592.45272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.45341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.45344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.45347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.45431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.47143: stdout chunk (state=3): >>>/root <<< 29946 1726882592.47279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.47305: stdout chunk (state=3): >>><<< 29946 1726882592.47309: stderr chunk (state=3): >>><<< 29946 1726882592.47416: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.47422: _low_level_execute_command(): starting 29946 1726882592.47425: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887 `" && echo ansible-tmp-1726882592.4733324-30819-21736861611887="` echo 
/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887 `" ) && sleep 0' 29946 1726882592.47967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882592.48086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.48112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.48127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.48225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.50117: stdout chunk (state=3): >>>ansible-tmp-1726882592.4733324-30819-21736861611887=/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887 <<< 29946 1726882592.50216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.50263: stderr chunk (state=3): >>><<< 29946 1726882592.50277: stdout chunk (state=3): >>><<< 29946 1726882592.50301: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882592.4733324-30819-21736861611887=/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.50345: variable 'ansible_module_compression' from source: unknown 29946 1726882592.50421: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882592.50456: variable 'ansible_facts' from source: unknown 
29946 1726882592.50638: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py 29946 1726882592.50771: Sending initial data 29946 1726882592.50775: Sent initial data (155 bytes) 29946 1726882592.51339: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882592.51353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882592.51366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.51413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882592.51428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.51499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.51523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.51549: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.51635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.53205: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882592.53296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882592.53358: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpul3s0zar /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py <<< 29946 1726882592.53380: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py" <<< 29946 1726882592.53436: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpul3s0zar" to remote "/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py" <<< 29946 1726882592.54359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.54362: stderr chunk (state=3): >>><<< 29946 1726882592.54364: stdout chunk (state=3): >>><<< 29946 1726882592.54366: done transferring module to remote 29946 1726882592.54368: _low_level_execute_command(): starting 29946 1726882592.54371: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/ /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py && sleep 0' 29946 1726882592.55099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.55103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.55129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.55164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.55222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.57040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.57052: stderr chunk (state=3): >>><<< 29946 1726882592.57065: stdout chunk (state=3): >>><<< 29946 1726882592.57086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.57170: _low_level_execute_command(): starting 29946 1726882592.57174: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/AnsiballZ_command.py && sleep 0' 29946 1726882592.57695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882592.57711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882592.57735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.57753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882592.57810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.57870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.57888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.57920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.58017: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.74798: stdout chunk (state=3): >>> {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 21:36:32.730643", "end": "2024-09-20 21:36:32.747037", "delta": "0:00:00.016394", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882592.76352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882592.76356: stdout chunk (state=3): >>><<< 29946 1726882592.76365: stderr chunk (state=3): >>><<< 29946 1726882592.76382: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600", "stderr": "", "rc": 0, "cmd": ["nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0"], "start": "2024-09-20 21:36:32.730643", "end": "2024-09-20 21:36:32.747037", "delta": "0:00:00.016394", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f ipv6.routing-rules c show \"ethtest0\"", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
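For reference, the module invocation captured above corresponds to a task of roughly the following shape in tests_routing_rules.yml. This is a hedged reconstruction, not the playbook source: only the task name and the nmcli command line are taken from the log; the register variable name and the changed_when setting are assumptions (the summarized result below reports "changed": false even though the raw command module output says "changed": true, which is consistent with changed_when: false).

    # Approximate reconstruction -- not copied from the playbook source
    - name: Get the IPv6 routing rule for the connection "{{ interface }}"
      command: nmcli -f ipv6.routing-rules c show "{{ interface }}"
      register: connection_route_rule6   # hypothetical name; the real register target is not visible in this excerpt
      changed_when: false                # assumed from the "changed": false in the reported task result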
29946 1726882592.76428: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f ipv6.routing-rules c show "ethtest0"', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882592.76434: _low_level_execute_command(): starting 29946 1726882592.76442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882592.4733324-30819-21736861611887/ > /dev/null 2>&1 && sleep 0' 29946 1726882592.77127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882592.77148: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882592.77151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882592.77179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.77207: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882592.77215: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882592.77274: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882592.77307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882592.77319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882592.77336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882592.77427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882592.79252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882592.79272: stderr chunk (state=3): >>><<< 29946 1726882592.79277: stdout chunk (state=3): >>><<< 29946 1726882592.79296: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882592.79305: handler run complete 29946 1726882592.79322: Evaluated conditional (False): False 29946 1726882592.79330: attempt loop complete, returning result 29946 1726882592.79333: _execute() done 29946 1726882592.79335: dumping result to json 29946 1726882592.79341: done dumping result, returning 29946 1726882592.79351: done running TaskExecutor() for managed_node2/TASK: Get the IPv6 routing rule for the connection "ethtest0" [12673a56-9f93-95e7-9dfb-000000000061] 29946 1726882592.79354: sending task result for task 12673a56-9f93-95e7-9dfb-000000000061 29946 1726882592.79444: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000061 29946 1726882592.79446: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "nmcli", "-f", "ipv6.routing-rules", "c", "show", "ethtest0" ], "delta": "0:00:00.016394", "end": "2024-09-20 21:36:32.747037", "rc": 0, "start": "2024-09-20 21:36:32.730643" } STDOUT: ipv6.routing-rules: priority 30600 to 2001:db8::4/32 table 30600, priority 30601 not from ::/0 dport 128-256 table 30600, priority 30602 from ::/0 table 30600 29946 1726882592.79523: no more pending results, returning what we have 29946 1726882592.79526: results queue empty 29946 1726882592.79527: checking for any_errors_fatal 29946 1726882592.79536: done checking for any_errors_fatal 29946 1726882592.79537: checking for max_fail_percentage 29946 1726882592.79539: done checking for max_fail_percentage 29946 1726882592.79539: checking to see if all hosts have failed and the running result is not ok 29946 1726882592.79540: done checking to see if all hosts have failed 29946 1726882592.79541: getting the remaining hosts for this loop 29946 1726882592.79542: done getting the remaining hosts for this loop 29946 1726882592.79545: getting the next task for host managed_node2 29946 1726882592.79552: done getting next task for host managed_node2 29946 1726882592.79555: ^ task is: TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 29946 1726882592.79557: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882592.79561: getting variables 29946 1726882592.79562: in VariableManager get_vars() 29946 1726882592.79599: Calling all_inventory to load vars for managed_node2 29946 1726882592.79602: Calling groups_inventory to load vars for managed_node2 29946 1726882592.79604: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882592.79616: Calling all_plugins_play to load vars for managed_node2 29946 1726882592.79619: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882592.79621: Calling groups_plugins_play to load vars for managed_node2 29946 1726882592.80831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882592.81743: done with get_vars() 29946 1726882592.81759: done getting variables 29946 1726882592.81805: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30200 matches the specified rule] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:155 Friday 20 September 2024 21:36:32 -0400 (0:00:00.395) 0:00:18.927 ****** 29946 1726882592.81824: entering _queue_task() for managed_node2/assert 29946 1726882592.82029: worker is 1 (out of 1 available) 29946 1726882592.82043: exiting _queue_task() for managed_node2/assert 29946 1726882592.82055: done queuing things up, now waiting for results queue to drain 29946 1726882592.82057: waiting for pending results... 
29946 1726882592.82230: running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule 29946 1726882592.82287: in run() - task 12673a56-9f93-95e7-9dfb-000000000062 29946 1726882592.82310: variable 'ansible_search_path' from source: unknown 29946 1726882592.82334: calling self._execute() 29946 1726882592.82414: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.82423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.82431: variable 'omit' from source: magic vars 29946 1726882592.82708: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.82719: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882592.82814: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.82817: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882592.82825: variable 'omit' from source: magic vars 29946 1726882592.82867: variable 'omit' from source: magic vars 29946 1726882592.82945: variable 'omit' from source: magic vars 29946 1726882592.82948: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882592.83100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882592.83103: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882592.83106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.83109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.83111: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882592.83114: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.83116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.83157: Set connection var ansible_pipelining to False 29946 1726882592.83162: Set connection var ansible_shell_executable to /bin/sh 29946 1726882592.83167: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882592.83173: Set connection var ansible_timeout to 10 29946 1726882592.83234: Set connection var ansible_shell_type to sh 29946 1726882592.83241: Set connection var ansible_connection to ssh 29946 1726882592.83243: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.83246: variable 'ansible_connection' from source: unknown 29946 1726882592.83248: variable 'ansible_module_compression' from source: unknown 29946 1726882592.83250: variable 'ansible_shell_type' from source: unknown 29946 1726882592.83252: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.83254: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.83256: variable 'ansible_pipelining' from source: unknown 29946 1726882592.83258: variable 'ansible_timeout' from source: unknown 29946 1726882592.83260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.83368: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882592.83379: variable 'omit' from source: magic vars 29946 1726882592.83385: starting attempt loop 29946 1726882592.83388: running the handler 29946 1726882592.83581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882592.83800: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882592.84192: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882592.84365: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882592.84407: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882592.84506: variable 'route_rule_table_30200' from source: set_fact 29946 1726882592.84537: Evaluated conditional (route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")): True 29946 1726882592.84679: variable 'route_rule_table_30200' from source: set_fact 29946 1726882592.84713: Evaluated conditional (route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")): True 29946 1726882592.84897: variable 'route_rule_table_30200' from source: set_fact 29946 1726882592.84901: Evaluated conditional (route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")): True 29946 1726882592.85011: variable 'route_rule_table_30200' from source: set_fact 29946 1726882592.85038: Evaluated conditional (route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")): True 29946 1726882592.85172: variable 'route_rule_table_30200' from source: set_fact 29946 1726882592.85209: Evaluated conditional (route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")): True 29946 1726882592.85224: handler run complete 29946 1726882592.85299: attempt loop complete, returning result 29946 1726882592.85302: _execute() done 29946 1726882592.85305: dumping result to json 29946 1726882592.85307: done dumping result, returning 29946 1726882592.85309: done running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with table lookup 30200 matches the specified rule [12673a56-9f93-95e7-9dfb-000000000062] 29946 1726882592.85311: sending task result for task 12673a56-9f93-95e7-9dfb-000000000062 29946 1726882592.85372: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000062 29946 1726882592.85374: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882592.85419: no more pending results, returning what we have 29946 1726882592.85422: results queue empty 29946 1726882592.85423: checking for any_errors_fatal 29946 1726882592.85430: done checking for any_errors_fatal 29946 1726882592.85430: checking for max_fail_percentage 29946 1726882592.85432: done checking for max_fail_percentage 29946 1726882592.85433: checking to see if all hosts have failed and the running result is not ok 29946 1726882592.85434: done checking to see if all hosts have failed 29946 1726882592.85434: getting the remaining hosts for this loop 29946 1726882592.85435: done getting the remaining hosts for this loop 29946 1726882592.85438: getting the next task for host 
managed_node2 29946 1726882592.85444: done getting next task for host managed_node2 29946 1726882592.85446: ^ task is: TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 29946 1726882592.85448: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882592.85451: getting variables 29946 1726882592.85453: in VariableManager get_vars() 29946 1726882592.85481: Calling all_inventory to load vars for managed_node2 29946 1726882592.85483: Calling groups_inventory to load vars for managed_node2 29946 1726882592.85485: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882592.85502: Calling all_plugins_play to load vars for managed_node2 29946 1726882592.85504: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882592.85507: Calling groups_plugins_play to load vars for managed_node2 29946 1726882592.88095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882592.89649: done with get_vars() 29946 1726882592.89663: done getting variables 29946 1726882592.89709: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30400 matches the specified rule] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:166 Friday 20 September 2024 21:36:32 -0400 (0:00:00.079) 0:00:19.006 ****** 29946 1726882592.89728: entering _queue_task() for managed_node2/assert 29946 1726882592.89938: worker is 1 (out of 1 available) 29946 1726882592.89950: exiting _queue_task() for managed_node2/assert 29946 1726882592.89961: done queuing things up, now waiting for results queue to drain 29946 1726882592.89963: waiting for pending results... 
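The five conditionals evaluated for the table-30200 task above map onto an assert of roughly the following form (task path tests_routing_rules.yml:155). The search expressions and the two distribution-version guards are taken verbatim from the evaluated conditionals in the log; the surrounding layout, including whether the guards sit on the task or on an enclosing block, is an approximation.

    # Approximate reconstruction; the expressions are exactly those evaluated above
    - name: Assert that the routing rule with table lookup 30200 matches the specified rule
      assert:
        that:
          - 'route_rule_table_30200.stdout is search("30200:(\s+)from 198.51.100.58/26 lookup 30200")'
          - 'route_rule_table_30200.stdout is search("30201:(\s+)from all fwmark 0x1/0x1 lookup 30200")'
          - 'route_rule_table_30200.stdout is search("30202:(\s+)from all ipproto tcp lookup 30200")'
          - 'route_rule_table_30200.stdout is search("30203:(\s+)from all sport 128-256 lookup 30200")'
          - 'route_rule_table_30200.stdout is search("30204:(\s+)from all tos (0x08|throughput) lookup 30200")'
      when:
        - "ansible_distribution_major_version != '6'"
        - 'ansible_distribution_major_version != "7"'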
29946 1726882592.90132: running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule 29946 1726882592.90201: in run() - task 12673a56-9f93-95e7-9dfb-000000000063 29946 1726882592.90213: variable 'ansible_search_path' from source: unknown 29946 1726882592.90241: calling self._execute() 29946 1726882592.90318: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.90322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.90331: variable 'omit' from source: magic vars 29946 1726882592.90603: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.90613: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882592.90688: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.90696: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882592.90704: variable 'omit' from source: magic vars 29946 1726882592.90720: variable 'omit' from source: magic vars 29946 1726882592.90750: variable 'omit' from source: magic vars 29946 1726882592.90780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882592.90811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882592.90826: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882592.90844: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.90854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.90875: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882592.90878: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.90882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.91010: Set connection var ansible_pipelining to False 29946 1726882592.91013: Set connection var ansible_shell_executable to /bin/sh 29946 1726882592.91027: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882592.91030: Set connection var ansible_timeout to 10 29946 1726882592.91032: Set connection var ansible_shell_type to sh 29946 1726882592.91035: Set connection var ansible_connection to ssh 29946 1726882592.91142: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.91145: variable 'ansible_connection' from source: unknown 29946 1726882592.91147: variable 'ansible_module_compression' from source: unknown 29946 1726882592.91149: variable 'ansible_shell_type' from source: unknown 29946 1726882592.91152: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.91154: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.91156: variable 'ansible_pipelining' from source: unknown 29946 1726882592.91159: variable 'ansible_timeout' from source: unknown 29946 1726882592.91161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.91400: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882592.91403: variable 'omit' from source: magic vars 29946 1726882592.91406: starting attempt loop 29946 1726882592.91408: running the handler 29946 1726882592.91411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882592.91601: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882592.91642: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882592.91714: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882592.91752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882592.91833: variable 'route_rule_table_30400' from source: set_fact 29946 1726882592.91862: Evaluated conditional (route_rule_table_30400.stdout is search("30400:(\s+)from all to 198.51.100.128/26 lookup 30400")): True 29946 1726882592.91998: variable 'route_rule_table_30400' from source: set_fact 29946 1726882592.92024: Evaluated conditional (route_rule_table_30400.stdout is search("30401:(\s+)from all iif iiftest \[detached\] lookup 30400")): True 29946 1726882592.92137: variable 'route_rule_table_30400' from source: set_fact 29946 1726882592.92156: Evaluated conditional (route_rule_table_30400.stdout is search("30402:(\s+)from all oif oiftest \[detached\] lookup 30400")): True 29946 1726882592.92162: handler run complete 29946 1726882592.92172: attempt loop complete, returning result 29946 1726882592.92174: _execute() done 29946 1726882592.92177: dumping result to json 29946 1726882592.92180: done dumping result, returning 29946 1726882592.92187: done running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with table lookup 30400 matches the specified rule [12673a56-9f93-95e7-9dfb-000000000063] 29946 1726882592.92198: sending task result for task 12673a56-9f93-95e7-9dfb-000000000063 29946 1726882592.92273: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000063 29946 1726882592.92275: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882592.92343: no more pending results, returning what we have 29946 1726882592.92346: results queue empty 29946 1726882592.92347: checking for any_errors_fatal 29946 1726882592.92354: done checking for any_errors_fatal 29946 1726882592.92355: checking for max_fail_percentage 29946 1726882592.92357: done checking for max_fail_percentage 29946 1726882592.92357: checking to see if all hosts have failed and the running result is not ok 29946 1726882592.92358: done checking to see if all hosts have failed 29946 1726882592.92359: getting the remaining hosts for this loop 29946 1726882592.92360: done getting the remaining hosts for this loop 29946 1726882592.92363: getting the next task for host managed_node2 29946 1726882592.92368: done getting next task for host managed_node2 29946 1726882592.92370: ^ task is: TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 29946 1726882592.92372: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882592.92375: getting variables 29946 1726882592.92376: in VariableManager get_vars() 29946 1726882592.92408: Calling all_inventory to load vars for managed_node2 29946 1726882592.92410: Calling groups_inventory to load vars for managed_node2 29946 1726882592.92412: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882592.92423: Calling all_plugins_play to load vars for managed_node2 29946 1726882592.92425: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882592.92428: Calling groups_plugins_play to load vars for managed_node2 29946 1726882592.93204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882592.94166: done with get_vars() 29946 1726882592.94180: done getting variables 29946 1726882592.94222: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with table lookup 30600 matches the specified rule] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:175 Friday 20 September 2024 21:36:32 -0400 (0:00:00.045) 0:00:19.051 ****** 29946 1726882592.94240: entering _queue_task() for managed_node2/assert 29946 1726882592.94433: worker is 1 (out of 1 available) 29946 1726882592.94446: exiting _queue_task() for managed_node2/assert 29946 1726882592.94459: done queuing things up, now waiting for results queue to drain 29946 1726882592.94461: waiting for pending results... 
29946 1726882592.94621: running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule 29946 1726882592.94678: in run() - task 12673a56-9f93-95e7-9dfb-000000000064 29946 1726882592.94696: variable 'ansible_search_path' from source: unknown 29946 1726882592.94722: calling self._execute() 29946 1726882592.94799: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.94804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.94813: variable 'omit' from source: magic vars 29946 1726882592.95071: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.95080: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882592.95160: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.95164: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882592.95171: variable 'omit' from source: magic vars 29946 1726882592.95186: variable 'omit' from source: magic vars 29946 1726882592.95216: variable 'omit' from source: magic vars 29946 1726882592.95248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882592.95274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882592.95292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882592.95308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.95318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.95342: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882592.95345: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.95349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.95415: Set connection var ansible_pipelining to False 29946 1726882592.95418: Set connection var ansible_shell_executable to /bin/sh 29946 1726882592.95424: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882592.95429: Set connection var ansible_timeout to 10 29946 1726882592.95435: Set connection var ansible_shell_type to sh 29946 1726882592.95437: Set connection var ansible_connection to ssh 29946 1726882592.95455: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.95460: variable 'ansible_connection' from source: unknown 29946 1726882592.95463: variable 'ansible_module_compression' from source: unknown 29946 1726882592.95465: variable 'ansible_shell_type' from source: unknown 29946 1726882592.95467: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.95469: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.95472: variable 'ansible_pipelining' from source: unknown 29946 1726882592.95474: variable 'ansible_timeout' from source: unknown 29946 1726882592.95476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.95572: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882592.95581: variable 'omit' from source: magic vars 29946 1726882592.95588: starting attempt loop 29946 1726882592.95591: running the handler 29946 1726882592.95702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882592.95859: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882592.95887: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882592.95944: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882592.95970: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882592.96038: variable 'route_rule_table_30600' from source: set_fact 29946 1726882592.96058: Evaluated conditional (route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")): True 29946 1726882592.96150: variable 'route_rule_table_30600' from source: set_fact 29946 1726882592.96170: Evaluated conditional (route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")): True 29946 1726882592.96175: handler run complete 29946 1726882592.96185: attempt loop complete, returning result 29946 1726882592.96188: _execute() done 29946 1726882592.96195: dumping result to json 29946 1726882592.96198: done dumping result, returning 29946 1726882592.96208: done running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with table lookup 30600 matches the specified rule [12673a56-9f93-95e7-9dfb-000000000064] 29946 1726882592.96211: sending task result for task 12673a56-9f93-95e7-9dfb-000000000064 29946 1726882592.96285: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000064 29946 1726882592.96288: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882592.96333: no more pending results, returning what we have 29946 1726882592.96337: results queue empty 29946 1726882592.96337: checking for any_errors_fatal 29946 1726882592.96346: done checking for any_errors_fatal 29946 1726882592.96346: checking for max_fail_percentage 29946 1726882592.96348: done checking for max_fail_percentage 29946 1726882592.96349: checking to see if all hosts have failed and the running result is not ok 29946 1726882592.96350: done checking to see if all hosts have failed 29946 1726882592.96350: getting the remaining hosts for this loop 29946 1726882592.96352: done getting the remaining hosts for this loop 29946 1726882592.96355: getting the next task for host managed_node2 29946 1726882592.96360: done getting next task for host managed_node2 29946 1726882592.96362: ^ task is: TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 29946 1726882592.96364: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882592.96368: getting variables 29946 1726882592.96369: in VariableManager get_vars() 29946 1726882592.96409: Calling all_inventory to load vars for managed_node2 29946 1726882592.96412: Calling groups_inventory to load vars for managed_node2 29946 1726882592.96415: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882592.96424: Calling all_plugins_play to load vars for managed_node2 29946 1726882592.96426: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882592.96429: Calling groups_plugins_play to load vars for managed_node2 29946 1726882592.97176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882592.98040: done with get_vars() 29946 1726882592.98054: done getting variables 29946 1726882592.98091: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that the routing rule with 'custom' table lookup matches the specified rule] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:183 Friday 20 September 2024 21:36:32 -0400 (0:00:00.038) 0:00:19.090 ****** 29946 1726882592.98115: entering _queue_task() for managed_node2/assert 29946 1726882592.98300: worker is 1 (out of 1 available) 29946 1726882592.98313: exiting _queue_task() for managed_node2/assert 29946 1726882592.98325: done queuing things up, now waiting for results queue to drain 29946 1726882592.98326: waiting for pending results... 
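The table-30600 assert completed above follows the same pattern as the 30200 task, and its two expressions check the kernel-level rules (priority 30600 and 30601) that correspond to the ipv6.routing-rules values the nmcli task earlier in this log reported for "ethtest0". A minimal sketch, with only the expressions taken verbatim from the evaluated conditionals and everything else assumed:

    # Approximate reconstruction (tests_routing_rules.yml:175)
    - name: Assert that the routing rule with table lookup 30600 matches the specified rule
      assert:
        that:
          - 'route_rule_table_30600.stdout is search("30600:(\s+)from all to 2001:db8::4/32 lookup 30600")'
          - 'route_rule_table_30600.stdout is search("30601:(\s+)not from all dport 128-256 lookup 30600")'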
29946 1726882592.98483: running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule 29946 1726882592.98540: in run() - task 12673a56-9f93-95e7-9dfb-000000000065 29946 1726882592.98553: variable 'ansible_search_path' from source: unknown 29946 1726882592.98583: calling self._execute() 29946 1726882592.98657: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.98661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.98672: variable 'omit' from source: magic vars 29946 1726882592.98930: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.98940: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882592.99018: variable 'ansible_distribution_major_version' from source: facts 29946 1726882592.99022: Evaluated conditional (ansible_distribution_major_version != "7"): True 29946 1726882592.99028: variable 'omit' from source: magic vars 29946 1726882592.99043: variable 'omit' from source: magic vars 29946 1726882592.99068: variable 'omit' from source: magic vars 29946 1726882592.99102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882592.99129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882592.99143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882592.99157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.99167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882592.99187: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882592.99195: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.99198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.99265: Set connection var ansible_pipelining to False 29946 1726882592.99269: Set connection var ansible_shell_executable to /bin/sh 29946 1726882592.99274: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882592.99279: Set connection var ansible_timeout to 10 29946 1726882592.99285: Set connection var ansible_shell_type to sh 29946 1726882592.99287: Set connection var ansible_connection to ssh 29946 1726882592.99310: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.99313: variable 'ansible_connection' from source: unknown 29946 1726882592.99316: variable 'ansible_module_compression' from source: unknown 29946 1726882592.99318: variable 'ansible_shell_type' from source: unknown 29946 1726882592.99322: variable 'ansible_shell_executable' from source: unknown 29946 1726882592.99324: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882592.99326: variable 'ansible_pipelining' from source: unknown 29946 1726882592.99329: variable 'ansible_timeout' from source: unknown 29946 1726882592.99331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882592.99426: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882592.99434: variable 'omit' from source: magic vars 29946 1726882592.99448: starting attempt loop 29946 1726882592.99454: running the handler 29946 1726882592.99547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882592.99708: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882592.99736: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882592.99786: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882592.99816: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882592.99873: variable 'route_rule_table_custom' from source: set_fact 29946 1726882592.99901: Evaluated conditional (route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")): True 29946 1726882592.99904: handler run complete 29946 1726882592.99916: attempt loop complete, returning result 29946 1726882592.99919: _execute() done 29946 1726882592.99921: dumping result to json 29946 1726882592.99923: done dumping result, returning 29946 1726882592.99930: done running TaskExecutor() for managed_node2/TASK: Assert that the routing rule with 'custom' table lookup matches the specified rule [12673a56-9f93-95e7-9dfb-000000000065] 29946 1726882592.99934: sending task result for task 12673a56-9f93-95e7-9dfb-000000000065 ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882593.00056: no more pending results, returning what we have 29946 1726882593.00059: results queue empty 29946 1726882593.00060: checking for any_errors_fatal 29946 1726882593.00067: done checking for any_errors_fatal 29946 1726882593.00068: checking for max_fail_percentage 29946 1726882593.00070: done checking for max_fail_percentage 29946 1726882593.00070: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.00071: done checking to see if all hosts have failed 29946 1726882593.00072: getting the remaining hosts for this loop 29946 1726882593.00073: done getting the remaining hosts for this loop 29946 1726882593.00076: getting the next task for host managed_node2 29946 1726882593.00081: done getting next task for host managed_node2 29946 1726882593.00084: ^ task is: TASK: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}" 29946 1726882593.00085: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882593.00089: getting variables 29946 1726882593.00090: in VariableManager get_vars() 29946 1726882593.00122: Calling all_inventory to load vars for managed_node2 29946 1726882593.00125: Calling groups_inventory to load vars for managed_node2 29946 1726882593.00127: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.00135: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.00137: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.00140: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.00695: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000065 29946 1726882593.00698: WORKER PROCESS EXITING 29946 1726882593.01023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.01866: done with get_vars() 29946 1726882593.01879: done getting variables 29946 1726882593.01921: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882593.02004: variable 'interface' from source: set_fact TASK [Assert that the specified IPv4 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:190 Friday 20 September 2024 21:36:33 -0400 (0:00:00.039) 0:00:19.129 ****** 29946 1726882593.02029: entering _queue_task() for managed_node2/assert 29946 1726882593.02210: worker is 1 (out of 1 available) 29946 1726882593.02223: exiting _queue_task() for managed_node2/assert 29946 1726882593.02234: done queuing things up, now waiting for results queue to drain 29946 1726882593.02236: waiting for pending results... 
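The conditional evaluated above tests the set_fact variable route_rule_table_custom, which holds what appears to be `ip rule` output, for the rule installed in the 'custom' table. A task producing exactly this evaluation would look roughly like the sketch below; the task name and the conditional are copied verbatim from the log, while the surrounding YAML structure (module FQCN, list layout) is an assumption rather than the exact content of tests_routing_rules.yml:

- name: Assert that the routing rule with 'custom' table lookup matches the specified rule
  ansible.builtin.assert:
    that:
      - route_rule_table_custom.stdout is search("200:(\s+)from 198.51.100.56/26 lookup custom")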
29946 1726882593.02406: running TaskExecutor() for managed_node2/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" 29946 1726882593.02464: in run() - task 12673a56-9f93-95e7-9dfb-000000000066 29946 1726882593.02476: variable 'ansible_search_path' from source: unknown 29946 1726882593.02506: calling self._execute() 29946 1726882593.02587: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.02592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.02602: variable 'omit' from source: magic vars 29946 1726882593.02855: variable 'ansible_distribution_major_version' from source: facts 29946 1726882593.02865: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882593.02871: variable 'omit' from source: magic vars 29946 1726882593.02890: variable 'omit' from source: magic vars 29946 1726882593.02959: variable 'interface' from source: set_fact 29946 1726882593.02973: variable 'omit' from source: magic vars 29946 1726882593.03012: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882593.03035: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882593.03050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882593.03064: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.03073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.03098: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882593.03101: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.03104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.03170: Set connection var ansible_pipelining to False 29946 1726882593.03174: Set connection var ansible_shell_executable to /bin/sh 29946 1726882593.03179: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882593.03184: Set connection var ansible_timeout to 10 29946 1726882593.03191: Set connection var ansible_shell_type to sh 29946 1726882593.03196: Set connection var ansible_connection to ssh 29946 1726882593.03213: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.03216: variable 'ansible_connection' from source: unknown 29946 1726882593.03223: variable 'ansible_module_compression' from source: unknown 29946 1726882593.03227: variable 'ansible_shell_type' from source: unknown 29946 1726882593.03231: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.03234: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.03236: variable 'ansible_pipelining' from source: unknown 29946 1726882593.03238: variable 'ansible_timeout' from source: unknown 29946 1726882593.03240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.03345: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882593.03356: variable 'omit' from source: magic vars 29946 1726882593.03359: starting attempt loop 29946 1726882593.03362: running the handler 29946 1726882593.03466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882593.03623: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882593.03652: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882593.03708: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882593.03733: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882593.03797: variable 'connection_route_rule' from source: set_fact 29946 1726882593.03816: Evaluated conditional (connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")): True 29946 1726882593.03908: variable 'connection_route_rule' from source: set_fact 29946 1726882593.03924: Evaluated conditional (connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")): True 29946 1726882593.04010: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04027: Evaluated conditional (connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")): True 29946 1726882593.04117: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04131: Evaluated conditional (connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")): True 29946 1726882593.04216: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04234: Evaluated conditional (connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")): True 29946 1726882593.04319: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04336: Evaluated conditional (connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")): True 29946 1726882593.04425: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04442: Evaluated conditional (connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")): True 29946 1726882593.04527: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04546: Evaluated conditional (connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")): True 29946 1726882593.04631: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04647: Evaluated conditional (connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")): True 29946 1726882593.04734: variable 'connection_route_rule' from source: set_fact 29946 1726882593.04751: Evaluated conditional (connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")): True 29946 1726882593.04755: handler run complete 29946 1726882593.04771: attempt loop complete, returning result 29946 1726882593.04774: _execute() done 29946 1726882593.04776: dumping result to json 29946 1726882593.04778: done dumping result, returning 29946 1726882593.04782: done running TaskExecutor() for managed_node2/TASK: Assert that the specified IPv4 routing rule was configured in the connection "ethtest0" [12673a56-9f93-95e7-9dfb-000000000066] 29946 1726882593.04784: sending task result for task 
12673a56-9f93-95e7-9dfb-000000000066 29946 1726882593.04866: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000066 29946 1726882593.04868: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882593.04918: no more pending results, returning what we have 29946 1726882593.04922: results queue empty 29946 1726882593.04922: checking for any_errors_fatal 29946 1726882593.04931: done checking for any_errors_fatal 29946 1726882593.04932: checking for max_fail_percentage 29946 1726882593.04933: done checking for max_fail_percentage 29946 1726882593.04934: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.04935: done checking to see if all hosts have failed 29946 1726882593.04941: getting the remaining hosts for this loop 29946 1726882593.04942: done getting the remaining hosts for this loop 29946 1726882593.04946: getting the next task for host managed_node2 29946 1726882593.04951: done getting next task for host managed_node2 29946 1726882593.04954: ^ task is: TASK: Assert that the specified IPv6 routing rule was configured in the connection "{{ interface }}" 29946 1726882593.04956: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882593.04960: getting variables 29946 1726882593.04961: in VariableManager get_vars() 29946 1726882593.04999: Calling all_inventory to load vars for managed_node2 29946 1726882593.05002: Calling groups_inventory to load vars for managed_node2 29946 1726882593.05004: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.05013: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.05015: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.05017: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.05806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.06674: done with get_vars() 29946 1726882593.06691: done getting variables 29946 1726882593.06736: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882593.06818: variable 'interface' from source: set_fact TASK [Assert that the specified IPv6 routing rule was configured in the connection "ethtest0"] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:205 Friday 20 September 2024 21:36:33 -0400 (0:00:00.048) 0:00:19.177 ****** 29946 1726882593.06841: entering _queue_task() for managed_node2/assert 29946 1726882593.07054: worker is 1 (out of 1 available) 29946 1726882593.07067: exiting _queue_task() for managed_node2/assert 29946 1726882593.07078: done queuing things up, now waiting for results queue to drain 29946 1726882593.07080: waiting for pending results... 
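All ten conditionals above test connection_route_rule.stdout, which holds what looks like the IPv4 `ip rule` listing for the connection under test. Condensed into task form, the assert referenced at tests_routing_rules.yml:190 plausibly resembles the sketch below; the conditions are copied verbatim from the log, while the YAML structure and module FQCN are assumptions:

- name: Assert that the specified IPv4 routing rule was configured in the connection "{{ interface }}"
  ansible.builtin.assert:
    that:
      - connection_route_rule.stdout is search("priority 30200 from 198.51.100.58/26 table 30200")
      - connection_route_rule.stdout is search("priority 30201 from 0.0.0.0/0 fwmark 0x1/0x1 table 30200")
      - connection_route_rule.stdout is search("priority 30202 from 0.0.0.0/0 ipproto 6 table 30200")
      - connection_route_rule.stdout is search("priority 30203 from 0.0.0.0/0 sport 128-256 table 30200")
      - connection_route_rule.stdout is search("priority 30204 from 0.0.0.0/0 tos 0x08 table 30200")
      - connection_route_rule.stdout is search("priority 30400 to 198.51.100.128/26 table 30400")
      - connection_route_rule.stdout is search("priority 30401 from 0.0.0.0/0 iif iiftest table 30400")
      - connection_route_rule.stdout is search("priority 30402 from 0.0.0.0/0 oif oiftest table 30400")
      - connection_route_rule.stdout is search("priority 30403 from 0.0.0.0/0 table 30400")
      - connection_route_rule.stdout is search("priority 200 from 198.51.100.56/26 table 200")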
29946 1726882593.07246: running TaskExecutor() for managed_node2/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" 29946 1726882593.07303: in run() - task 12673a56-9f93-95e7-9dfb-000000000067 29946 1726882593.07319: variable 'ansible_search_path' from source: unknown 29946 1726882593.07347: calling self._execute() 29946 1726882593.07425: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.07428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.07437: variable 'omit' from source: magic vars 29946 1726882593.07697: variable 'ansible_distribution_major_version' from source: facts 29946 1726882593.07707: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882593.07714: variable 'omit' from source: magic vars 29946 1726882593.07730: variable 'omit' from source: magic vars 29946 1726882593.07802: variable 'interface' from source: set_fact 29946 1726882593.07816: variable 'omit' from source: magic vars 29946 1726882593.07847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882593.07877: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882593.07895: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882593.07909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.07919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.07941: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882593.07944: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.07946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.08018: Set connection var ansible_pipelining to False 29946 1726882593.08021: Set connection var ansible_shell_executable to /bin/sh 29946 1726882593.08027: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882593.08032: Set connection var ansible_timeout to 10 29946 1726882593.08038: Set connection var ansible_shell_type to sh 29946 1726882593.08041: Set connection var ansible_connection to ssh 29946 1726882593.08059: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.08062: variable 'ansible_connection' from source: unknown 29946 1726882593.08064: variable 'ansible_module_compression' from source: unknown 29946 1726882593.08069: variable 'ansible_shell_type' from source: unknown 29946 1726882593.08071: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.08073: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.08075: variable 'ansible_pipelining' from source: unknown 29946 1726882593.08077: variable 'ansible_timeout' from source: unknown 29946 1726882593.08089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.08176: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882593.08184: variable 'omit' from source: magic vars 29946 1726882593.08202: starting attempt loop 29946 1726882593.08205: running the handler 29946 1726882593.08305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882593.08463: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882593.08494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882593.08554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882593.08580: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882593.08644: variable 'connection_route_rule6' from source: set_fact 29946 1726882593.08663: Evaluated conditional (connection_route_rule6.stdout is search("priority 30600 to 2001:db8::4/32 table 30600")): True 29946 1726882593.08788: variable 'connection_route_rule6' from source: set_fact 29946 1726882593.08806: Evaluated conditional (connection_route_rule6.stdout is search("priority 30601 not from ::/0 dport 128-256 table 30600") or connection_route_rule6.stdout is search("not priority 30601 from ::/0 dport 128-256 table 30600")): True 29946 1726882593.08895: variable 'connection_route_rule6' from source: set_fact 29946 1726882593.08910: Evaluated conditional (connection_route_rule6.stdout is search("priority 30602 from ::/0 table 30600")): True 29946 1726882593.08915: handler run complete 29946 1726882593.08926: attempt loop complete, returning result 29946 1726882593.08928: _execute() done 29946 1726882593.08931: dumping result to json 29946 1726882593.08933: done dumping result, returning 29946 1726882593.08940: done running TaskExecutor() for managed_node2/TASK: Assert that the specified IPv6 routing rule was configured in the connection "ethtest0" [12673a56-9f93-95e7-9dfb-000000000067] 29946 1726882593.08946: sending task result for task 12673a56-9f93-95e7-9dfb-000000000067 29946 1726882593.09025: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000067 29946 1726882593.09028: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882593.09103: no more pending results, returning what we have 29946 1726882593.09107: results queue empty 29946 1726882593.09108: checking for any_errors_fatal 29946 1726882593.09115: done checking for any_errors_fatal 29946 1726882593.09115: checking for max_fail_percentage 29946 1726882593.09117: done checking for max_fail_percentage 29946 1726882593.09118: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.09119: done checking to see if all hosts have failed 29946 1726882593.09119: getting the remaining hosts for this loop 29946 1726882593.09120: done getting the remaining hosts for this loop 29946 1726882593.09123: getting the next task for host managed_node2 29946 1726882593.09128: done getting next task for host managed_node2 29946 1726882593.09131: ^ task is: TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 29946 1726882593.09132: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882593.09136: getting variables 29946 1726882593.09137: in VariableManager get_vars() 29946 1726882593.09168: Calling all_inventory to load vars for managed_node2 29946 1726882593.09170: Calling groups_inventory to load vars for managed_node2 29946 1726882593.09172: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.09180: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.09183: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.09187: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.10091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.10952: done with get_vars() 29946 1726882593.10966: done getting variables TASK [Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:213 Friday 20 September 2024 21:36:33 -0400 (0:00:00.041) 0:00:19.219 ****** 29946 1726882593.11037: entering _queue_task() for managed_node2/file 29946 1726882593.11284: worker is 1 (out of 1 available) 29946 1726882593.11301: exiting _queue_task() for managed_node2/file 29946 1726882593.11312: done queuing things up, now waiting for results queue to drain 29946 1726882593.11314: waiting for pending results... 29946 1726882593.11710: running TaskExecutor() for managed_node2/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` 29946 1726882593.11715: in run() - task 12673a56-9f93-95e7-9dfb-000000000068 29946 1726882593.11718: variable 'ansible_search_path' from source: unknown 29946 1726882593.11740: calling self._execute() 29946 1726882593.11848: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.11859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.11872: variable 'omit' from source: magic vars 29946 1726882593.12273: variable 'ansible_distribution_major_version' from source: facts 29946 1726882593.12288: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882593.12292: variable 'omit' from source: magic vars 29946 1726882593.12313: variable 'omit' from source: magic vars 29946 1726882593.12337: variable 'omit' from source: magic vars 29946 1726882593.12372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882593.12405: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882593.12423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882593.12441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.12450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.12475: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882593.12478: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.12481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.12550: Set connection var ansible_pipelining to False 29946 1726882593.12553: Set connection var ansible_shell_executable to /bin/sh 29946 1726882593.12559: 
Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882593.12566: Set connection var ansible_timeout to 10 29946 1726882593.12571: Set connection var ansible_shell_type to sh 29946 1726882593.12573: Set connection var ansible_connection to ssh 29946 1726882593.12592: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.12597: variable 'ansible_connection' from source: unknown 29946 1726882593.12600: variable 'ansible_module_compression' from source: unknown 29946 1726882593.12602: variable 'ansible_shell_type' from source: unknown 29946 1726882593.12604: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.12606: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.12609: variable 'ansible_pipelining' from source: unknown 29946 1726882593.12611: variable 'ansible_timeout' from source: unknown 29946 1726882593.12618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.12755: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882593.12764: variable 'omit' from source: magic vars 29946 1726882593.12769: starting attempt loop 29946 1726882593.12773: running the handler 29946 1726882593.12783: _low_level_execute_command(): starting 29946 1726882593.12791: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882593.13271: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.13308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.13312: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882593.13314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882593.13316: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.13366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.13372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.13374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.13442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.15135: stdout chunk (state=3): >>>/root <<< 29946 1726882593.15235: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.15303: stderr chunk (state=3): >>><<< 29946 1726882593.15306: stdout chunk (state=3): >>><<< 29946 1726882593.15310: _low_level_execute_command() done: rc=0, 
stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882593.15312: _low_level_execute_command(): starting 29946 1726882593.15315: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293 `" && echo ansible-tmp-1726882593.1529663-30853-41461436364293="` echo /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293 `" ) && sleep 0' 29946 1726882593.15919: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.15923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.15925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.15928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882593.15953: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882593.15963: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882593.15966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.15969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882593.16007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882593.16011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29946 1726882593.16014: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.16017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.16023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882593.16027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882593.16034: stderr chunk (state=3): >>>debug2: match found <<< 29946 1726882593.16044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.16122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 
1726882593.16125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.16146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.16235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.18234: stdout chunk (state=3): >>>ansible-tmp-1726882593.1529663-30853-41461436364293=/root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293 <<< 29946 1726882593.18398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.18402: stdout chunk (state=3): >>><<< 29946 1726882593.18404: stderr chunk (state=3): >>><<< 29946 1726882593.18406: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882593.1529663-30853-41461436364293=/root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882593.18409: variable 'ansible_module_compression' from source: unknown 29946 1726882593.18699: ANSIBALLZ: Using lock for file 29946 1726882593.18703: ANSIBALLZ: Acquiring lock 29946 1726882593.18705: ANSIBALLZ: Lock acquired: 140626579264992 29946 1726882593.18707: ANSIBALLZ: Creating module 29946 1726882593.33270: ANSIBALLZ: Writing module into payload 29946 1726882593.33427: ANSIBALLZ: Writing module 29946 1726882593.33478: ANSIBALLZ: Renaming module 29946 1726882593.33489: ANSIBALLZ: Done creating module 29946 1726882593.33514: variable 'ansible_facts' from source: unknown 29946 1726882593.33621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py 29946 1726882593.33781: Sending initial data 29946 1726882593.33784: Sent initial data (152 bytes) 29946 1726882593.34361: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.34376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.34390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.34491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.34518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.34615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.36243: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882593.36338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882593.36399: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpzabdt842 /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py <<< 29946 1726882593.36458: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py" <<< 29946 1726882593.36545: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpzabdt842" to remote "/root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py" <<< 29946 1726882593.37436: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.37445: stdout chunk (state=3): >>><<< 29946 1726882593.37463: stderr chunk (state=3): >>><<< 29946 1726882593.37574: done transferring module to remote 29946 1726882593.37577: _low_level_execute_command(): starting 29946 1726882593.37579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/ /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py && sleep 0' 29946 1726882593.38120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.38137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.38151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.38236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.38267: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.38282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.38304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.38395: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.40273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.40276: stdout chunk (state=3): >>><<< 29946 1726882593.40279: stderr chunk (state=3): >>><<< 29946 1726882593.40300: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882593.40484: _low_level_execute_command(): starting 29946 1726882593.40487: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/AnsiballZ_file.py && sleep 0' 29946 1726882593.41654: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.41657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882593.41660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.41662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.41664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.41907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.42027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.42089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.57621: stdout chunk (state=3): >>> <<< 29946 1726882593.57653: stdout chunk (state=3): >>>{"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} <<< 29946 1726882593.58881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882593.58912: stdout chunk (state=3): >>><<< 29946 1726882593.58926: stderr chunk (state=3): >>><<< 29946 1726882593.58950: _low_level_execute_command() done: rc=0, stdout= {"path": "/etc/iproute2/rt_tables.d/table.conf", "changed": true, "diff": {"before": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "file"}, "after": {"path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent"}}, "state": "absent", "invocation": {"module_args": {"state": "absent", "path": "/etc/iproute2/rt_tables.d/table.conf", "recurse": false, "force": false, "follow": true, "modification_time_format": "%Y%m%d%H%M.%S", "access_time_format": "%Y%m%d%H%M.%S", "unsafe_writes": false, "_original_basename": null, "_diff_peek": null, "src": null, "modification_time": null, "access_time": null, "mode": null, "owner": null, "group": null, "seuser": null, "serole": null, "selevel": null, "setype": null, "attributes": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
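The module result above shows the file module run end to end: a remote temporary directory is created, AnsiballZ_file.py is transferred over sftp, made executable, and executed with /usr/bin/python3.12, returning changed=true for state=absent on /etc/iproute2/rt_tables.d/table.conf. In playbook form, the task referenced at tests_routing_rules.yml:213 plausibly amounts to the sketch below; the task name, path, and state are taken from the log, everything else (module FQCN, layout) is an assumption:

- name: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/`
  ansible.builtin.file:
    path: /etc/iproute2/rt_tables.d/table.conf
    state: absent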
29946 1726882593.58999: done with _execute_module (file, {'state': 'absent', 'path': '/etc/iproute2/rt_tables.d/table.conf', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'file', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882593.59027: _low_level_execute_command(): starting 29946 1726882593.59036: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882593.1529663-30853-41461436364293/ > /dev/null 2>&1 && sleep 0' 29946 1726882593.59658: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.59672: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.59690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.59711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882593.59730: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882593.59749: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882593.59764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.59782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882593.59862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.59895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.59915: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.59939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.60028: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.61836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.61848: stderr chunk (state=3): >>><<< 29946 1726882593.61852: stdout chunk (state=3): >>><<< 29946 1726882593.61865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882593.61871: handler run complete 29946 1726882593.61890: attempt loop complete, returning result 29946 1726882593.61892: _execute() done 29946 1726882593.61897: dumping result to json 29946 1726882593.61899: done dumping result, returning 29946 1726882593.61906: done running TaskExecutor() for managed_node2/TASK: Remove the dedicated test file in `/etc/iproute2/rt_tables.d/` [12673a56-9f93-95e7-9dfb-000000000068] 29946 1726882593.61909: sending task result for task 12673a56-9f93-95e7-9dfb-000000000068 29946 1726882593.62008: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000068 29946 1726882593.62011: WORKER PROCESS EXITING changed: [managed_node2] => { "changed": true, "path": "/etc/iproute2/rt_tables.d/table.conf", "state": "absent" } 29946 1726882593.62075: no more pending results, returning what we have 29946 1726882593.62079: results queue empty 29946 1726882593.62080: checking for any_errors_fatal 29946 1726882593.62089: done checking for any_errors_fatal 29946 1726882593.62090: checking for max_fail_percentage 29946 1726882593.62092: done checking for max_fail_percentage 29946 1726882593.62094: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.62095: done checking to see if all hosts have failed 29946 1726882593.62096: getting the remaining hosts for this loop 29946 1726882593.62098: done getting the remaining hosts for this loop 29946 1726882593.62101: getting the next task for host managed_node2 29946 1726882593.62109: done getting next task for host managed_node2 29946 1726882593.62111: ^ task is: TASK: meta (flush_handlers) 29946 1726882593.62113: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882593.62117: getting variables 29946 1726882593.62119: in VariableManager get_vars() 29946 1726882593.62156: Calling all_inventory to load vars for managed_node2 29946 1726882593.62158: Calling groups_inventory to load vars for managed_node2 29946 1726882593.62160: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.62170: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.62173: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.62175: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.63029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.64467: done with get_vars() 29946 1726882593.64489: done getting variables 29946 1726882593.64537: in VariableManager get_vars() 29946 1726882593.64550: Calling all_inventory to load vars for managed_node2 29946 1726882593.64552: Calling groups_inventory to load vars for managed_node2 29946 1726882593.64554: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.64559: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.64561: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.64564: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.65694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.66602: done with get_vars() 29946 1726882593.66618: done queuing things up, now waiting for results queue to drain 29946 1726882593.66620: results queue empty 29946 1726882593.66620: checking for any_errors_fatal 29946 1726882593.66622: done checking for any_errors_fatal 29946 1726882593.66623: checking for max_fail_percentage 29946 1726882593.66623: done checking for max_fail_percentage 29946 1726882593.66624: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.66624: done checking to see if all hosts have failed 29946 1726882593.66625: getting the remaining hosts for this loop 29946 1726882593.66625: done getting the remaining hosts for this loop 29946 1726882593.66627: getting the next task for host managed_node2 29946 1726882593.66630: done getting next task for host managed_node2 29946 1726882593.66631: ^ task is: TASK: meta (flush_handlers) 29946 1726882593.66632: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882593.66637: getting variables 29946 1726882593.66637: in VariableManager get_vars() 29946 1726882593.66645: Calling all_inventory to load vars for managed_node2 29946 1726882593.66646: Calling groups_inventory to load vars for managed_node2 29946 1726882593.66648: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.66651: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.66652: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.66654: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.67279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.68581: done with get_vars() 29946 1726882593.68597: done getting variables 29946 1726882593.68630: in VariableManager get_vars() 29946 1726882593.68637: Calling all_inventory to load vars for managed_node2 29946 1726882593.68638: Calling groups_inventory to load vars for managed_node2 29946 1726882593.68640: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.68642: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.68644: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.68645: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.69263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.70106: done with get_vars() 29946 1726882593.70122: done queuing things up, now waiting for results queue to drain 29946 1726882593.70123: results queue empty 29946 1726882593.70124: checking for any_errors_fatal 29946 1726882593.70124: done checking for any_errors_fatal 29946 1726882593.70125: checking for max_fail_percentage 29946 1726882593.70125: done checking for max_fail_percentage 29946 1726882593.70126: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.70126: done checking to see if all hosts have failed 29946 1726882593.70127: getting the remaining hosts for this loop 29946 1726882593.70127: done getting the remaining hosts for this loop 29946 1726882593.70129: getting the next task for host managed_node2 29946 1726882593.70131: done getting next task for host managed_node2 29946 1726882593.70131: ^ task is: None 29946 1726882593.70132: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882593.70133: done queuing things up, now waiting for results queue to drain 29946 1726882593.70134: results queue empty 29946 1726882593.70134: checking for any_errors_fatal 29946 1726882593.70134: done checking for any_errors_fatal 29946 1726882593.70135: checking for max_fail_percentage 29946 1726882593.70135: done checking for max_fail_percentage 29946 1726882593.70136: checking to see if all hosts have failed and the running result is not ok 29946 1726882593.70136: done checking to see if all hosts have failed 29946 1726882593.70138: getting the next task for host managed_node2 29946 1726882593.70139: done getting next task for host managed_node2 29946 1726882593.70140: ^ task is: None 29946 1726882593.70140: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882593.70181: in VariableManager get_vars() 29946 1726882593.70200: done with get_vars() 29946 1726882593.70204: in VariableManager get_vars() 29946 1726882593.70212: done with get_vars() 29946 1726882593.70215: variable 'omit' from source: magic vars 29946 1726882593.70298: variable 'profile' from source: play vars 29946 1726882593.70376: in VariableManager get_vars() 29946 1726882593.70386: done with get_vars() 29946 1726882593.70403: variable 'omit' from source: magic vars 29946 1726882593.70449: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 29946 1726882593.70878: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29946 1726882593.70900: getting the remaining hosts for this loop 29946 1726882593.70901: done getting the remaining hosts for this loop 29946 1726882593.70903: getting the next task for host managed_node2 29946 1726882593.70904: done getting next task for host managed_node2 29946 1726882593.70906: ^ task is: TASK: Gathering Facts 29946 1726882593.70907: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882593.70908: getting variables 29946 1726882593.70909: in VariableManager get_vars() 29946 1726882593.70916: Calling all_inventory to load vars for managed_node2 29946 1726882593.70917: Calling groups_inventory to load vars for managed_node2 29946 1726882593.70919: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882593.70922: Calling all_plugins_play to load vars for managed_node2 29946 1726882593.70924: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882593.70925: Calling groups_plugins_play to load vars for managed_node2 29946 1726882593.71660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882593.72497: done with get_vars() 29946 1726882593.72510: done getting variables 29946 1726882593.72537: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Friday 20 September 2024 21:36:33 -0400 (0:00:00.615) 0:00:19.835 ****** 29946 1726882593.72553: entering _queue_task() for managed_node2/gather_facts 29946 1726882593.72777: worker is 1 (out of 1 available) 29946 1726882593.72788: exiting _queue_task() for managed_node2/gather_facts 29946 1726882593.72802: done queuing things up, now waiting for results queue to drain 29946 1726882593.72804: waiting for pending results... 
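[editor's note] The trace that follows records Ansible's standard module-execution flow for this fact-gathering task: discover the remote home directory with `echo ~`, create a per-task temp directory under ~/.ansible/tmp, transfer the AnsiballZ-wrapped setup module over SFTP, mark it executable, run it with the remote interpreter (/usr/bin/python3.12), and parse the JSON document it prints on stdout. Below is a minimal sketch of that same six-step sequence using plain ssh/scp subprocess calls instead of Ansible's ssh connection plugin; the host, key handling, and the local module payload path are assumptions for illustration only.

    # Minimal sketch of the remote-execution sequence visible in the trace below.
    # Assumes password-less SSH to HOST and a locally prepared AnsiballZ payload at MODULE;
    # Ansible itself performs these steps via its ssh connection plugin and ControlMaster mux.
    import json
    import subprocess
    import time

    HOST = "root@10.31.14.69"        # target address seen in the trace (assumed reachable directly)
    MODULE = "AnsiballZ_setup.py"    # hypothetical local path to the prepared module payload

    def ssh(cmd: str) -> str:
        """Run a shell command on the target and return its stdout."""
        return subprocess.run(
            ["ssh", HOST, cmd], check=True, capture_output=True, text=True
        ).stdout

    home = ssh("echo ~ && sleep 0").strip()                    # 1. find the remote home dir
    tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"     # 2. per-task temp directory
    ssh(f"umask 77 && mkdir -p {home}/.ansible/tmp && mkdir {tmp}")

    subprocess.run(["scp", MODULE, f"{HOST}:{tmp}/AnsiballZ_setup.py"], check=True)  # 3. transfer (Ansible uses sftp here)
    ssh(f"chmod u+x {tmp}/ {tmp}/AnsiballZ_setup.py")          # 4. make the dir and module executable
    out = ssh(f"/usr/bin/python3.12 {tmp}/AnsiballZ_setup.py") # 5. run the module on the target
    facts = json.loads(out)["ansible_facts"]                   # 6. parse the JSON result
    print(facts["ansible_distribution"], facts["ansible_default_ipv4"]["address"])

Ansible adds bookkeeping around each step (module compression, connection multiplexing, temp-dir cleanup), but the log entries that follow map directly onto these six steps.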
29946 1726882593.73109: running TaskExecutor() for managed_node2/TASK: Gathering Facts 29946 1726882593.73114: in run() - task 12673a56-9f93-95e7-9dfb-0000000004b1 29946 1726882593.73169: variable 'ansible_search_path' from source: unknown 29946 1726882593.73221: calling self._execute() 29946 1726882593.73333: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.73345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.73367: variable 'omit' from source: magic vars 29946 1726882593.73800: variable 'ansible_distribution_major_version' from source: facts 29946 1726882593.73831: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882593.73847: variable 'omit' from source: magic vars 29946 1726882593.73890: variable 'omit' from source: magic vars 29946 1726882593.73920: variable 'omit' from source: magic vars 29946 1726882593.73951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882593.73978: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882593.73997: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882593.74023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.74030: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882593.74053: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882593.74056: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.74058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.74130: Set connection var ansible_pipelining to False 29946 1726882593.74135: Set connection var ansible_shell_executable to /bin/sh 29946 1726882593.74138: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882593.74144: Set connection var ansible_timeout to 10 29946 1726882593.74151: Set connection var ansible_shell_type to sh 29946 1726882593.74153: Set connection var ansible_connection to ssh 29946 1726882593.74169: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.74173: variable 'ansible_connection' from source: unknown 29946 1726882593.74176: variable 'ansible_module_compression' from source: unknown 29946 1726882593.74179: variable 'ansible_shell_type' from source: unknown 29946 1726882593.74181: variable 'ansible_shell_executable' from source: unknown 29946 1726882593.74184: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882593.74188: variable 'ansible_pipelining' from source: unknown 29946 1726882593.74191: variable 'ansible_timeout' from source: unknown 29946 1726882593.74195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882593.74321: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882593.74330: variable 'omit' from source: magic vars 29946 1726882593.74335: starting attempt loop 29946 1726882593.74340: running the 
handler 29946 1726882593.74354: variable 'ansible_facts' from source: unknown 29946 1726882593.74371: _low_level_execute_command(): starting 29946 1726882593.74377: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882593.74847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.74850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.74853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882593.74856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.74902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.74920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.74971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.76625: stdout chunk (state=3): >>>/root <<< 29946 1726882593.76799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.76802: stdout chunk (state=3): >>><<< 29946 1726882593.76804: stderr chunk (state=3): >>><<< 29946 1726882593.76809: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882593.76811: _low_level_execute_command(): starting 29946 1726882593.76814: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607 `" && 
echo ansible-tmp-1726882593.7673044-30880-94727518004607="` echo /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607 `" ) && sleep 0' 29946 1726882593.77357: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.77376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.77396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.77412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882593.77429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882593.77489: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.77559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.77562: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.77597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.77679: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.79541: stdout chunk (state=3): >>>ansible-tmp-1726882593.7673044-30880-94727518004607=/root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607 <<< 29946 1726882593.79694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.79698: stdout chunk (state=3): >>><<< 29946 1726882593.79700: stderr chunk (state=3): >>><<< 29946 1726882593.79899: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882593.7673044-30880-94727518004607=/root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 29946 1726882593.79902: variable 'ansible_module_compression' from source: unknown 29946 1726882593.79904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 1726882593.79907: variable 'ansible_facts' from source: unknown 29946 1726882593.80084: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py 29946 1726882593.80259: Sending initial data 29946 1726882593.80268: Sent initial data (153 bytes) 29946 1726882593.80867: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.80882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.80995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.81021: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.81038: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.81129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.82648: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 29946 1726882593.82657: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 29946 1726882593.82676: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882593.82737: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882593.82805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpl11m2uff /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py <<< 29946 1726882593.82808: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py" <<< 29946 1726882593.82861: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpl11m2uff" to remote "/root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py" <<< 29946 1726882593.82864: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py" <<< 29946 1726882593.84166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.84169: stdout chunk (state=3): >>><<< 29946 1726882593.84172: stderr chunk (state=3): >>><<< 29946 1726882593.84174: done transferring module to remote 29946 1726882593.84175: _low_level_execute_command(): starting 29946 1726882593.84177: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/ /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py && sleep 0' 29946 1726882593.84649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.84661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882593.84682: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.84726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.84738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.84803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882593.86656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882593.86660: stdout chunk (state=3): >>><<< 29946 1726882593.86662: stderr chunk (state=3): >>><<< 29946 1726882593.86665: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882593.86668: _low_level_execute_command(): starting 29946 1726882593.86670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/AnsiballZ_setup.py && sleep 0' 29946 1726882593.87172: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882593.87189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882593.87206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882593.87225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882593.87312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882593.87343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882593.87365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882593.87384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882593.87484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882595.52908: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], 
"ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", 
"console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "34", "epoch": "1726882594", "epoch_int": "1726882594", "date": "2024-09-20", "time": "21:36:34", "iso8601_micro": "2024-09-21T01:36:34.149172Z", "iso8601": "2024-09-21T01:36:34Z", "iso8601_basic": "20240920T213634149172", "iso8601_basic_short": "20240920T213634", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.634765625, "5m": 0.529296875, "15m": 0.2978515625}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["rpltstbr", "eth0", "lo", "peerethtest0", "ethtest0"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f2:06:aa:e8:e0:af", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::f006:aaff:fee8:e0af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", 
"hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "aa:8b:67:39:99:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::a88b:67ff:fe39:99d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmen<<< 29946 1726882595.52925: stdout chunk (state=3): >>>tation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["198.51.100.3", "10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["2001:db8::2", "fe80::f006:aaff:fee8:e0af", "fe80::a88b:67ff:fe39:99d2", "fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::8ff:c1ff:fe46:633b", "fe80::a88b:67ff:fe39:99d2", "fe80::f006:aaff:fee8:e0af"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2940, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 591, "free": 2940}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["48538<<< 29946 1726882595.52929: stdout chunk (state=3): >>>57d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 785, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789741056, "block_size": 4096, "block_total": 65519099, "block_available": 63913511, "block_used": 1605588, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882595.54673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.14.69 closed. <<< 29946 1726882595.54723: stderr chunk (state=3): >>><<< 29946 1726882595.54739: stdout chunk (state=3): >>><<< 29946 1726882595.54907: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "34", "epoch": "1726882594", "epoch_int": "1726882594", "date": "2024-09-20", "time": "21:36:34", "iso8601_micro": "2024-09-21T01:36:34.149172Z", "iso8601": "2024-09-21T01:36:34Z", "iso8601_basic": "20240920T213634149172", "iso8601_basic_short": "20240920T213634", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.634765625, "5m": 0.529296875, "15m": 0.2978515625}, "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["rpltstbr", "eth0", "lo", "peerethtest0", "ethtest0"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f2:06:aa:e8:e0:af", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "198.51.100.3", "broadcast": "198.51.100.63", "netmask": "255.255.255.192", "network": "198.51.100.0", "prefix": "26"}, "ipv6": [{"address": "2001:db8::2", "prefix": "32", "scope": "global"}, {"address": "fe80::f006:aaff:fee8:e0af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", 
"tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "aa:8b:67:39:99:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::a88b:67ff:fe39:99d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, 
"active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": 
"off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["198.51.100.3", "10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["2001:db8::2", "fe80::f006:aaff:fee8:e0af", "fe80::a88b:67ff:fe39:99d2", "fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72", "198.51.100.3"], "ipv6": ["::1", "2001:db8::2", "fe80::8ff:c1ff:fe46:633b", "fe80::a88b:67ff:fe39:99d2", "fe80::f006:aaff:fee8:e0af"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2940, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 591, "free": 2940}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 785, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789741056, "block_size": 4096, "block_total": 65519099, "block_available": 63913511, "block_used": 1605588, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882595.56104: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882595.56227: _low_level_execute_command(): starting 29946 1726882595.56231: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882593.7673044-30880-94727518004607/ > /dev/null 2>&1 && sleep 0' 29946 1726882595.57933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882595.57947: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882595.57961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882595.57979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882595.58008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882595.58022: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882595.58036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882595.58054: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882595.58068: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882595.58112: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882595.58172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882595.58217: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882595.58310: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882595.60299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882595.60303: stdout chunk (state=3): >>><<< 29946 1726882595.60306: stderr chunk (state=3): >>><<< 29946 1726882595.60676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882595.60680: handler run complete 29946 1726882595.60713: variable 'ansible_facts' from source: unknown 29946 1726882595.60848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882595.61443: variable 'ansible_facts' from source: unknown 29946 1726882595.61620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882595.61981: attempt loop complete, returning result 29946 1726882595.62042: _execute() done 29946 1726882595.62050: dumping result to json 29946 1726882595.62198: done dumping result, returning 29946 1726882595.62201: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-95e7-9dfb-0000000004b1] 29946 1726882595.62204: sending task result for task 12673a56-9f93-95e7-9dfb-0000000004b1 29946 1726882595.63241: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000004b1 29946 1726882595.63244: WORKER PROCESS EXITING ok: [managed_node2] 29946 1726882595.63827: no more pending results, returning what we have 29946 1726882595.63830: results queue empty 29946 1726882595.63831: checking for any_errors_fatal 29946 1726882595.63832: done checking for any_errors_fatal 29946 1726882595.63833: checking for max_fail_percentage 29946 1726882595.63835: done checking for max_fail_percentage 29946 1726882595.63835: checking to see if all hosts have failed and the running result is not ok 29946 1726882595.63836: done checking to see if all hosts have failed 29946 1726882595.63837: getting the remaining hosts for this loop 29946 1726882595.63838: done getting the remaining hosts for this loop 29946 1726882595.63841: getting the next task for host managed_node2 29946 1726882595.63846: done getting next task for host managed_node2 29946 1726882595.63848: ^ task is: TASK: meta (flush_handlers) 29946 1726882595.63850: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882595.63854: getting variables 29946 1726882595.63996: in VariableManager get_vars() 29946 1726882595.64026: Calling all_inventory to load vars for managed_node2 29946 1726882595.64029: Calling groups_inventory to load vars for managed_node2 29946 1726882595.64032: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882595.64041: Calling all_plugins_play to load vars for managed_node2 29946 1726882595.64044: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882595.64047: Calling groups_plugins_play to load vars for managed_node2 29946 1726882595.65724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882595.67868: done with get_vars() 29946 1726882595.67899: done getting variables 29946 1726882595.67973: in VariableManager get_vars() 29946 1726882595.67987: Calling all_inventory to load vars for managed_node2 29946 1726882595.67989: Calling groups_inventory to load vars for managed_node2 29946 1726882595.67992: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882595.68000: Calling all_plugins_play to load vars for managed_node2 29946 1726882595.68003: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882595.68006: Calling groups_plugins_play to load vars for managed_node2 29946 1726882595.69221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882595.72257: done with get_vars() 29946 1726882595.72288: done queuing things up, now waiting for results queue to drain 29946 1726882595.72409: results queue empty 29946 1726882595.72410: checking for any_errors_fatal 29946 1726882595.72415: done checking for any_errors_fatal 29946 1726882595.72416: checking for max_fail_percentage 29946 1726882595.72417: done checking for max_fail_percentage 29946 1726882595.72418: checking to see if all hosts have failed and the running result is not ok 29946 1726882595.72419: done checking to see if all hosts have failed 29946 1726882595.72419: getting the remaining hosts for this loop 29946 1726882595.72420: done getting the remaining hosts for this loop 29946 1726882595.72423: getting the next task for host managed_node2 29946 1726882595.72428: done getting next task for host managed_node2 29946 1726882595.72431: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29946 1726882595.72433: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882595.72443: getting variables 29946 1726882595.72444: in VariableManager get_vars() 29946 1726882595.72459: Calling all_inventory to load vars for managed_node2 29946 1726882595.72461: Calling groups_inventory to load vars for managed_node2 29946 1726882595.72463: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882595.72468: Calling all_plugins_play to load vars for managed_node2 29946 1726882595.72470: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882595.72473: Calling groups_plugins_play to load vars for managed_node2 29946 1726882595.74976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882595.78471: done with get_vars() 29946 1726882595.78607: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:35 -0400 (0:00:02.061) 0:00:21.896 ****** 29946 1726882595.78682: entering _queue_task() for managed_node2/include_tasks 29946 1726882595.79440: worker is 1 (out of 1 available) 29946 1726882595.79705: exiting _queue_task() for managed_node2/include_tasks 29946 1726882595.79716: done queuing things up, now waiting for results queue to drain 29946 1726882595.79717: waiting for pending results... 29946 1726882595.80709: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29946 1726882595.80715: in run() - task 12673a56-9f93-95e7-9dfb-000000000071 29946 1726882595.80940: variable 'ansible_search_path' from source: unknown 29946 1726882595.80944: variable 'ansible_search_path' from source: unknown 29946 1726882595.80946: calling self._execute() 29946 1726882595.81034: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882595.81108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882595.81122: variable 'omit' from source: magic vars 29946 1726882595.81902: variable 'ansible_distribution_major_version' from source: facts 29946 1726882595.81925: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882595.81936: _execute() done 29946 1726882595.81943: dumping result to json 29946 1726882595.81950: done dumping result, returning 29946 1726882595.82134: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-95e7-9dfb-000000000071] 29946 1726882595.82137: sending task result for task 12673a56-9f93-95e7-9dfb-000000000071 29946 1726882595.82212: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000071 29946 1726882595.82216: WORKER PROCESS EXITING 29946 1726882595.82258: no more pending results, returning what we have 29946 1726882595.82263: in VariableManager get_vars() 29946 1726882595.82311: Calling all_inventory to load vars for managed_node2 29946 1726882595.82314: Calling groups_inventory to load vars for managed_node2 29946 1726882595.82317: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882595.82330: Calling all_plugins_play to load vars for managed_node2 29946 1726882595.82334: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882595.82337: Calling groups_plugins_play to load vars for managed_node2 29946 1726882595.86223: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882595.89986: done with get_vars() 29946 1726882595.90417: variable 'ansible_search_path' from source: unknown 29946 1726882595.90419: variable 'ansible_search_path' from source: unknown 29946 1726882595.90449: we have included files to process 29946 1726882595.90450: generating all_blocks data 29946 1726882595.90452: done generating all_blocks data 29946 1726882595.90453: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882595.90454: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882595.90456: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882595.92260: done processing included file 29946 1726882595.92262: iterating over new_blocks loaded from include file 29946 1726882595.92263: in VariableManager get_vars() 29946 1726882595.92282: done with get_vars() 29946 1726882595.92284: filtering new block on tags 29946 1726882595.92303: done filtering new block on tags 29946 1726882595.92306: in VariableManager get_vars() 29946 1726882595.92325: done with get_vars() 29946 1726882595.92327: filtering new block on tags 29946 1726882595.92346: done filtering new block on tags 29946 1726882595.92349: in VariableManager get_vars() 29946 1726882595.92367: done with get_vars() 29946 1726882595.92369: filtering new block on tags 29946 1726882595.92385: done filtering new block on tags 29946 1726882595.92387: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 29946 1726882595.92795: extending task lists for all hosts with included blocks 29946 1726882595.93575: done extending task lists 29946 1726882595.93576: done processing included files 29946 1726882595.93577: results queue empty 29946 1726882595.93578: checking for any_errors_fatal 29946 1726882595.93579: done checking for any_errors_fatal 29946 1726882595.93580: checking for max_fail_percentage 29946 1726882595.93581: done checking for max_fail_percentage 29946 1726882595.93582: checking to see if all hosts have failed and the running result is not ok 29946 1726882595.93583: done checking to see if all hosts have failed 29946 1726882595.93583: getting the remaining hosts for this loop 29946 1726882595.93585: done getting the remaining hosts for this loop 29946 1726882595.93587: getting the next task for host managed_node2 29946 1726882595.93591: done getting next task for host managed_node2 29946 1726882595.93797: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29946 1726882595.93800: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882595.93809: getting variables 29946 1726882595.93810: in VariableManager get_vars() 29946 1726882595.93823: Calling all_inventory to load vars for managed_node2 29946 1726882595.93826: Calling groups_inventory to load vars for managed_node2 29946 1726882595.93828: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882595.93833: Calling all_plugins_play to load vars for managed_node2 29946 1726882595.93836: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882595.93839: Calling groups_plugins_play to load vars for managed_node2 29946 1726882596.04108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882596.05701: done with get_vars() 29946 1726882596.05726: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:36 -0400 (0:00:00.271) 0:00:22.167 ****** 29946 1726882596.05805: entering _queue_task() for managed_node2/setup 29946 1726882596.06174: worker is 1 (out of 1 available) 29946 1726882596.06187: exiting _queue_task() for managed_node2/setup 29946 1726882596.06201: done queuing things up, now waiting for results queue to drain 29946 1726882596.06202: waiting for pending results... 29946 1726882596.06531: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29946 1726882596.06639: in run() - task 12673a56-9f93-95e7-9dfb-0000000004f2 29946 1726882596.06661: variable 'ansible_search_path' from source: unknown 29946 1726882596.06669: variable 'ansible_search_path' from source: unknown 29946 1726882596.06710: calling self._execute() 29946 1726882596.06816: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882596.06831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882596.06852: variable 'omit' from source: magic vars 29946 1726882596.07225: variable 'ansible_distribution_major_version' from source: facts 29946 1726882596.07244: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882596.07462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882596.09637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882596.10003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882596.10007: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882596.10109: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882596.10113: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882596.10300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882596.10304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 29946 1726882596.10307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882596.10354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882596.10427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882596.10489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882596.10580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882596.10652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882596.10776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882596.10798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882596.11117: variable '__network_required_facts' from source: role '' defaults 29946 1726882596.11183: variable 'ansible_facts' from source: unknown 29946 1726882596.12965: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 29946 1726882596.13199: when evaluation is False, skipping this task 29946 1726882596.13203: _execute() done 29946 1726882596.13206: dumping result to json 29946 1726882596.13208: done dumping result, returning 29946 1726882596.13211: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-95e7-9dfb-0000000004f2] 29946 1726882596.13214: sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f2 29946 1726882596.13288: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f2 29946 1726882596.13292: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882596.13340: no more pending results, returning what we have 29946 1726882596.13344: results queue empty 29946 1726882596.13345: checking for any_errors_fatal 29946 1726882596.13347: done checking for any_errors_fatal 29946 1726882596.13348: checking for max_fail_percentage 29946 1726882596.13350: done checking for max_fail_percentage 29946 1726882596.13351: checking to see if all hosts have failed and the running result is not ok 29946 1726882596.13352: done checking to see if all hosts have failed 29946 1726882596.13353: getting the remaining hosts for this loop 29946 1726882596.13354: done getting the remaining hosts for 
this loop 29946 1726882596.13358: getting the next task for host managed_node2 29946 1726882596.13369: done getting next task for host managed_node2 29946 1726882596.13374: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 29946 1726882596.13377: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882596.13391: getting variables 29946 1726882596.13395: in VariableManager get_vars() 29946 1726882596.13441: Calling all_inventory to load vars for managed_node2 29946 1726882596.13444: Calling groups_inventory to load vars for managed_node2 29946 1726882596.13447: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882596.13459: Calling all_plugins_play to load vars for managed_node2 29946 1726882596.13462: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882596.13465: Calling groups_plugins_play to load vars for managed_node2 29946 1726882596.16585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882596.20251: done with get_vars() 29946 1726882596.20274: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:36 -0400 (0:00:00.146) 0:00:22.314 ****** 29946 1726882596.20483: entering _queue_task() for managed_node2/stat 29946 1726882596.21154: worker is 1 (out of 1 available) 29946 1726882596.21166: exiting _queue_task() for managed_node2/stat 29946 1726882596.21177: done queuing things up, now waiting for results queue to drain 29946 1726882596.21179: waiting for pending results... 
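The task above ("Ensure ansible_facts used by role are present", set_facts.yml:3) was queued as a setup action and then skipped because the conditional __network_required_facts | difference(ansible_facts.keys() | list) | length > 0 evaluated to False, meaning every fact the role requires was already gathered. A minimal sketch of such a guarded re-gather task, based only on the action type and when expression confirmed in the log (the gather_subset value is a placeholder, not read from the role):

    # Sketch of a conditional fact re-gather guarded like the task above;
    # gather_subset is an assumption, not shown in this log.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
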
29946 1726882596.21914: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 29946 1726882596.22046: in run() - task 12673a56-9f93-95e7-9dfb-0000000004f4 29946 1726882596.22230: variable 'ansible_search_path' from source: unknown 29946 1726882596.22233: variable 'ansible_search_path' from source: unknown 29946 1726882596.22237: calling self._execute() 29946 1726882596.22386: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882596.22459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882596.22474: variable 'omit' from source: magic vars 29946 1726882596.23198: variable 'ansible_distribution_major_version' from source: facts 29946 1726882596.23429: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882596.23738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882596.24310: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882596.24314: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882596.24389: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882596.24561: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882596.24647: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882596.24724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882596.24873: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882596.24905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882596.25001: variable '__network_is_ostree' from source: set_fact 29946 1726882596.25083: Evaluated conditional (not __network_is_ostree is defined): False 29946 1726882596.25094: when evaluation is False, skipping this task 29946 1726882596.25101: _execute() done 29946 1726882596.25108: dumping result to json 29946 1726882596.25115: done dumping result, returning 29946 1726882596.25125: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-95e7-9dfb-0000000004f4] 29946 1726882596.25134: sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f4 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29946 1726882596.25337: no more pending results, returning what we have 29946 1726882596.25341: results queue empty 29946 1726882596.25342: checking for any_errors_fatal 29946 1726882596.25347: done checking for any_errors_fatal 29946 1726882596.25347: checking for max_fail_percentage 29946 1726882596.25349: done checking for max_fail_percentage 29946 1726882596.25350: checking to see if all hosts have 
failed and the running result is not ok 29946 1726882596.25351: done checking to see if all hosts have failed 29946 1726882596.25352: getting the remaining hosts for this loop 29946 1726882596.25354: done getting the remaining hosts for this loop 29946 1726882596.25358: getting the next task for host managed_node2 29946 1726882596.25364: done getting next task for host managed_node2 29946 1726882596.25367: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29946 1726882596.25370: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882596.25385: getting variables 29946 1726882596.25387: in VariableManager get_vars() 29946 1726882596.25431: Calling all_inventory to load vars for managed_node2 29946 1726882596.25434: Calling groups_inventory to load vars for managed_node2 29946 1726882596.25436: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882596.25447: Calling all_plugins_play to load vars for managed_node2 29946 1726882596.25450: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882596.25453: Calling groups_plugins_play to load vars for managed_node2 29946 1726882596.26226: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f4 29946 1726882596.26230: WORKER PROCESS EXITING 29946 1726882596.28656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882596.32418: done with get_vars() 29946 1726882596.32442: done getting variables 29946 1726882596.32502: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:36 -0400 (0:00:00.120) 0:00:22.434 ****** 29946 1726882596.32539: entering _queue_task() for managed_node2/set_fact 29946 1726882596.33805: worker is 1 (out of 1 available) 29946 1726882596.33816: exiting _queue_task() for managed_node2/set_fact 29946 1726882596.33828: done queuing things up, now waiting for results queue to drain 29946 1726882596.33829: waiting for pending results... 
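The ostree check above ("Check if system is ostree", a stat action at set_facts.yml:12) skipped because not __network_is_ostree is defined evaluated to False, i.e. the flag was already set earlier in the run; the "Set flag to indicate system is ostree" set_fact task just queued carries the same guard. The log does not show which file is checked, so the path and the registered variable in this sketch are assumptions; only the action types and the when condition come from the output above:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed path, not shown in this log
      register: __ostree_stat           # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
      when: not __network_is_ostree is defined
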
29946 1726882596.34328: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29946 1726882596.34488: in run() - task 12673a56-9f93-95e7-9dfb-0000000004f5 29946 1726882596.34753: variable 'ansible_search_path' from source: unknown 29946 1726882596.34756: variable 'ansible_search_path' from source: unknown 29946 1726882596.34759: calling self._execute() 29946 1726882596.34857: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882596.35083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882596.35090: variable 'omit' from source: magic vars 29946 1726882596.35677: variable 'ansible_distribution_major_version' from source: facts 29946 1726882596.35749: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882596.36097: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882596.36700: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882596.36748: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882596.36852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882596.37099: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882596.37131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882596.37359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882596.37363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882596.37366: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882596.37521: variable '__network_is_ostree' from source: set_fact 29946 1726882596.37531: Evaluated conditional (not __network_is_ostree is defined): False 29946 1726882596.37537: when evaluation is False, skipping this task 29946 1726882596.37543: _execute() done 29946 1726882596.37549: dumping result to json 29946 1726882596.37559: done dumping result, returning 29946 1726882596.37584: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-95e7-9dfb-0000000004f5] 29946 1726882596.37599: sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f5 29946 1726882596.37857: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f5 29946 1726882596.37860: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29946 1726882596.37927: no more pending results, returning what we have 29946 1726882596.37931: results queue empty 29946 1726882596.37932: checking for any_errors_fatal 29946 1726882596.37939: done checking for any_errors_fatal 29946 
1726882596.37940: checking for max_fail_percentage 29946 1726882596.37941: done checking for max_fail_percentage 29946 1726882596.37942: checking to see if all hosts have failed and the running result is not ok 29946 1726882596.37944: done checking to see if all hosts have failed 29946 1726882596.37945: getting the remaining hosts for this loop 29946 1726882596.37946: done getting the remaining hosts for this loop 29946 1726882596.37950: getting the next task for host managed_node2 29946 1726882596.37960: done getting next task for host managed_node2 29946 1726882596.37964: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 29946 1726882596.37967: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882596.37981: getting variables 29946 1726882596.37983: in VariableManager get_vars() 29946 1726882596.38021: Calling all_inventory to load vars for managed_node2 29946 1726882596.38024: Calling groups_inventory to load vars for managed_node2 29946 1726882596.38026: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882596.38036: Calling all_plugins_play to load vars for managed_node2 29946 1726882596.38038: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882596.38040: Calling groups_plugins_play to load vars for managed_node2 29946 1726882596.41780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882596.45564: done with get_vars() 29946 1726882596.45587: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:36 -0400 (0:00:00.133) 0:00:22.568 ****** 29946 1726882596.45871: entering _queue_task() for managed_node2/service_facts 29946 1726882596.46550: worker is 1 (out of 1 available) 29946 1726882596.46562: exiting _queue_task() for managed_node2/service_facts 29946 1726882596.46573: done queuing things up, now waiting for results queue to drain 29946 1726882596.46575: waiting for pending results... 
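The next task, "Check which services are running" (set_facts.yml:21), is queued as a service_facts action; the remainder of this excerpt is the connection plumbing that creates a remote temp directory and transfers the AnsiballZ_service_facts.py payload over the multiplexed SSH/SFTP channel. A minimal usage sketch of service_facts and its ansible_facts.services result (the follow-up debug task and the unit name it looks up are illustrative, not part of this run):

    - name: Check which services are running
      ansible.builtin.service_facts:

    # Illustrative follow-up, not from this playbook: service_facts publishes
    # results under ansible_facts.services keyed by unit name.
    - name: Report NetworkManager state
      ansible.builtin.debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state | default('unknown') }}"
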
29946 1726882596.47128: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 29946 1726882596.47300: in run() - task 12673a56-9f93-95e7-9dfb-0000000004f7 29946 1726882596.47354: variable 'ansible_search_path' from source: unknown 29946 1726882596.47358: variable 'ansible_search_path' from source: unknown 29946 1726882596.47395: calling self._execute() 29946 1726882596.47666: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882596.47701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882596.47705: variable 'omit' from source: magic vars 29946 1726882596.48545: variable 'ansible_distribution_major_version' from source: facts 29946 1726882596.48557: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882596.48665: variable 'omit' from source: magic vars 29946 1726882596.48678: variable 'omit' from source: magic vars 29946 1726882596.49201: variable 'omit' from source: magic vars 29946 1726882596.49205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882596.49208: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882596.49224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882596.49246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882596.49323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882596.49355: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882596.49423: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882596.49432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882596.49853: Set connection var ansible_pipelining to False 29946 1726882596.49856: Set connection var ansible_shell_executable to /bin/sh 29946 1726882596.49859: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882596.49865: Set connection var ansible_timeout to 10 29946 1726882596.49879: Set connection var ansible_shell_type to sh 29946 1726882596.49889: Set connection var ansible_connection to ssh 29946 1726882596.49985: variable 'ansible_shell_executable' from source: unknown 29946 1726882596.49999: variable 'ansible_connection' from source: unknown 29946 1726882596.50317: variable 'ansible_module_compression' from source: unknown 29946 1726882596.50320: variable 'ansible_shell_type' from source: unknown 29946 1726882596.50322: variable 'ansible_shell_executable' from source: unknown 29946 1726882596.50324: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882596.50326: variable 'ansible_pipelining' from source: unknown 29946 1726882596.50328: variable 'ansible_timeout' from source: unknown 29946 1726882596.50330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882596.50644: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882596.50721: variable 'omit' from source: magic vars 29946 
1726882596.50724: starting attempt loop 29946 1726882596.50727: running the handler 29946 1726882596.50730: _low_level_execute_command(): starting 29946 1726882596.50733: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882596.52117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882596.52156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882596.52180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882596.52219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882596.52406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882596.53981: stdout chunk (state=3): >>>/root <<< 29946 1726882596.54243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882596.54259: stdout chunk (state=3): >>><<< 29946 1726882596.54271: stderr chunk (state=3): >>><<< 29946 1726882596.54300: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882596.54319: _low_level_execute_command(): starting 29946 1726882596.54332: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690 `" && echo ansible-tmp-1726882596.5430684-30990-25864520366690="` echo 
/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690 `" ) && sleep 0' 29946 1726882596.55483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882596.55620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882596.55665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882596.55782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882596.55878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882596.55954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882596.55966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882596.56041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882596.56141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882596.58047: stdout chunk (state=3): >>>ansible-tmp-1726882596.5430684-30990-25864520366690=/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690 <<< 29946 1726882596.58196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882596.58207: stdout chunk (state=3): >>><<< 29946 1726882596.58219: stderr chunk (state=3): >>><<< 29946 1726882596.58242: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882596.5430684-30990-25864520366690=/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882596.58304: variable 'ansible_module_compression' from source: unknown 29946 
1726882596.58357: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 29946 1726882596.58576: variable 'ansible_facts' from source: unknown 29946 1726882596.58580: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py 29946 1726882596.58723: Sending initial data 29946 1726882596.58732: Sent initial data (161 bytes) 29946 1726882596.59353: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882596.59366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882596.59428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882596.59450: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882596.59519: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882596.59799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882596.61305: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882596.61343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882596.61403: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpnqfgm2h3 /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py <<< 29946 1726882596.61413: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py" <<< 29946 1726882596.61470: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpnqfgm2h3" to remote "/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py" <<< 29946 1726882596.62864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882596.62979: stderr chunk (state=3): >>><<< 29946 1726882596.63109: stdout chunk (state=3): >>><<< 29946 1726882596.63113: done transferring module to remote 29946 1726882596.63231: _low_level_execute_command(): starting 29946 1726882596.63236: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/ /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py && sleep 0' 29946 1726882596.64608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882596.64941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882596.65292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882596.65299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882596.65440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882596.67399: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882596.67403: stderr chunk (state=3): >>><<< 29946 1726882596.67405: stdout chunk (state=3): >>><<< 29946 1726882596.67408: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882596.67413: _low_level_execute_command(): starting 29946 1726882596.67416: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/AnsiballZ_service_facts.py && sleep 0' 29946 1726882596.68866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882596.68881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882596.68979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882598.18930: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 29946 1726882598.18968: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": 
"rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 29946 1726882598.18999: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": 
"dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 29946 1726882598.19016: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 29946 1726882598.20561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882598.20564: stdout chunk (state=3): >>><<< 29946 1726882598.20572: stderr chunk (state=3): >>><<< 29946 1726882598.20611: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": 
{"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": 
{"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", 
"status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": 
"inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": 
"systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882598.22398: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882598.22402: _low_level_execute_command(): starting 29946 1726882598.22405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882596.5430684-30990-25864520366690/ > /dev/null 2>&1 && sleep 0' 29946 1726882598.23324: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882598.23590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882598.23619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882598.23707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882598.25514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882598.25616: stdout chunk (state=3): >>><<< 29946 1726882598.25619: stderr chunk (state=3): >>><<< 29946 1726882598.25631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882598.25642: handler run complete 29946 1726882598.26047: variable 'ansible_facts' from source: unknown 29946 1726882598.26599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882598.27579: variable 'ansible_facts' from source: unknown 29946 1726882598.27897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882598.28282: attempt loop complete, returning result 29946 1726882598.28356: _execute() done 29946 1726882598.28373: dumping result to json 29946 1726882598.28573: done dumping result, returning 29946 1726882598.28647: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-95e7-9dfb-0000000004f7] 29946 1726882598.28657: sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f7 29946 1726882598.30400: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f7 29946 1726882598.30403: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882598.30492: no more pending results, returning what we have 29946 1726882598.30497: results queue empty 29946 1726882598.30498: checking for any_errors_fatal 29946 1726882598.30504: done checking for any_errors_fatal 29946 1726882598.30505: checking for max_fail_percentage 29946 1726882598.30507: done checking for max_fail_percentage 29946 1726882598.30508: checking to see if all hosts have failed and the running result is not ok 29946 1726882598.30508: done checking to see if all hosts have failed 29946 1726882598.30509: getting the remaining hosts for this loop 29946 1726882598.30511: done getting the remaining hosts for this loop 29946 1726882598.30514: getting the next task for host managed_node2 29946 1726882598.30521: done getting next task for host managed_node2 29946 1726882598.30524: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 29946 1726882598.30527: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882598.30538: getting variables 29946 1726882598.30540: in VariableManager get_vars() 29946 1726882598.30573: Calling all_inventory to load vars for managed_node2 29946 1726882598.30575: Calling groups_inventory to load vars for managed_node2 29946 1726882598.30578: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882598.30589: Calling all_plugins_play to load vars for managed_node2 29946 1726882598.30592: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882598.30973: Calling groups_plugins_play to load vars for managed_node2 29946 1726882598.32569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882598.35274: done with get_vars() 29946 1726882598.35307: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:38 -0400 (0:00:01.895) 0:00:24.463 ****** 29946 1726882598.35413: entering _queue_task() for managed_node2/package_facts 29946 1726882598.35840: worker is 1 (out of 1 available) 29946 1726882598.35852: exiting _queue_task() for managed_node2/package_facts 29946 1726882598.35864: done queuing things up, now waiting for results queue to drain 29946 1726882598.35866: waiting for pending results... 29946 1726882598.36217: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 29946 1726882598.36474: in run() - task 12673a56-9f93-95e7-9dfb-0000000004f8 29946 1726882598.36478: variable 'ansible_search_path' from source: unknown 29946 1726882598.36480: variable 'ansible_search_path' from source: unknown 29946 1726882598.36486: calling self._execute() 29946 1726882598.36692: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882598.36781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882598.36973: variable 'omit' from source: magic vars 29946 1726882598.37675: variable 'ansible_distribution_major_version' from source: facts 29946 1726882598.37734: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882598.37746: variable 'omit' from source: magic vars 29946 1726882598.37936: variable 'omit' from source: magic vars 29946 1726882598.37968: variable 'omit' from source: magic vars 29946 1726882598.38011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882598.38263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882598.38266: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882598.38269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882598.38271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882598.38378: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882598.38388: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882598.38478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882598.38618: Set connection var ansible_pipelining to False 29946 
1726882598.38628: Set connection var ansible_shell_executable to /bin/sh 29946 1726882598.38637: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882598.38646: Set connection var ansible_timeout to 10 29946 1726882598.38656: Set connection var ansible_shell_type to sh 29946 1726882598.38662: Set connection var ansible_connection to ssh 29946 1726882598.38687: variable 'ansible_shell_executable' from source: unknown 29946 1726882598.38704: variable 'ansible_connection' from source: unknown 29946 1726882598.38711: variable 'ansible_module_compression' from source: unknown 29946 1726882598.38717: variable 'ansible_shell_type' from source: unknown 29946 1726882598.38901: variable 'ansible_shell_executable' from source: unknown 29946 1726882598.38904: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882598.38906: variable 'ansible_pipelining' from source: unknown 29946 1726882598.38910: variable 'ansible_timeout' from source: unknown 29946 1726882598.38913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882598.39216: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882598.39298: variable 'omit' from source: magic vars 29946 1726882598.39301: starting attempt loop 29946 1726882598.39303: running the handler 29946 1726882598.39305: _low_level_execute_command(): starting 29946 1726882598.39307: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882598.40968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882598.41127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882598.41208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882598.42808: stdout chunk (state=3): >>>/root <<< 29946 1726882598.42937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882598.43036: stderr chunk (state=3): >>><<< 29946 1726882598.43057: stdout chunk (state=3): >>><<< 29946 1726882598.43134: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882598.43142: _low_level_execute_command(): starting 29946 1726882598.43145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302 `" && echo ansible-tmp-1726882598.430776-31074-78098965042302="` echo /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302 `" ) && sleep 0' 29946 1726882598.43763: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882598.43782: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882598.43808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882598.43908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882598.43936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882598.43954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882598.43974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882598.44140: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882598.46073: stdout chunk (state=3): >>>ansible-tmp-1726882598.430776-31074-78098965042302=/root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302 <<< 29946 1726882598.46191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882598.46217: stderr chunk (state=3): >>><<< 29946 1726882598.46221: stdout chunk (state=3): >>><<< 29946 1726882598.46392: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882598.430776-31074-78098965042302=/root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882598.46397: variable 'ansible_module_compression' from source: unknown 29946 1726882598.46501: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 29946 1726882598.46630: variable 'ansible_facts' from source: unknown 29946 1726882598.47056: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py 29946 1726882598.47319: Sending initial data 29946 1726882598.47455: Sent initial data (160 bytes) 29946 1726882598.48926: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882598.49052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882598.49113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882598.50648: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882598.50702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882598.50775: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpleeo0d21 /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py <<< 29946 1726882598.50778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py" <<< 29946 1726882598.50898: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpleeo0d21" to remote "/root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py" <<< 29946 1726882598.54099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882598.54103: stdout chunk (state=3): >>><<< 29946 1726882598.54154: stderr chunk (state=3): >>><<< 29946 1726882598.54352: done transferring module to remote 29946 1726882598.54356: _low_level_execute_command(): starting 29946 1726882598.54358: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/ /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py && sleep 0' 29946 1726882598.55474: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882598.55486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882598.55502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882598.55572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882598.55583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882598.55687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882598.55777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882598.57522: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 29946 1726882598.57526: stdout chunk (state=3): >>><<< 29946 1726882598.57533: stderr chunk (state=3): >>><<< 29946 1726882598.57545: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882598.57553: _low_level_execute_command(): starting 29946 1726882598.57563: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/AnsiballZ_package_facts.py && sleep 0' 29946 1726882598.58409: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882598.58512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882598.58533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882598.58600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882599.02107: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": 
"17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 29946 1726882599.02245: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": 
"4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": 
"1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", 
"version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": 
"binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", 
"version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 29946 1726882599.02304: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", 
"version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", 
"release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": 
[{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 29946 1726882599.03999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882599.04003: stdout chunk (state=3): >>><<< 29946 1726882599.04027: stderr chunk (state=3): >>><<< 29946 1726882599.04084: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882599.06599: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882599.06604: _low_level_execute_command(): starting 29946 1726882599.06613: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882598.430776-31074-78098965042302/ > /dev/null 2>&1 && sleep 0' 29946 1726882599.07500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882599.07518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882599.07571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882599.07676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882599.07698: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882599.07722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882599.07889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882599.09824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882599.09828: stderr chunk (state=3): >>><<< 29946 1726882599.09830: stdout chunk (state=3): >>><<< 29946 1726882599.09948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882599.09952: handler run complete 29946 1726882599.11625: variable 'ansible_facts' from source: unknown 29946 1726882599.12299: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.14800: variable 'ansible_facts' from source: unknown 29946 1726882599.15373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.16153: attempt loop complete, returning result 29946 1726882599.16171: _execute() done 29946 1726882599.16200: dumping result to json 29946 1726882599.16413: done dumping result, returning 29946 1726882599.16430: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-95e7-9dfb-0000000004f8] 29946 1726882599.16451: sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f8 29946 1726882599.18892: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000004f8 29946 1726882599.18899: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882599.19061: no more pending results, returning what we have 29946 1726882599.19064: results queue empty 29946 1726882599.19065: checking for any_errors_fatal 29946 1726882599.19071: done checking for any_errors_fatal 29946 1726882599.19072: checking for max_fail_percentage 29946 1726882599.19074: done checking for max_fail_percentage 29946 1726882599.19074: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.19075: done checking to see if all hosts have failed 29946 1726882599.19076: getting the remaining hosts for this loop 29946 1726882599.19077: done getting the remaining hosts for this loop 29946 1726882599.19081: getting the next task for host managed_node2 29946 1726882599.19087: done getting next task for host managed_node2 29946 1726882599.19090: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 29946 1726882599.19092: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882599.19106: getting variables 29946 1726882599.19107: in VariableManager get_vars() 29946 1726882599.19135: Calling all_inventory to load vars for managed_node2 29946 1726882599.19138: Calling groups_inventory to load vars for managed_node2 29946 1726882599.19140: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.19148: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.19151: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.19153: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.20491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.22049: done with get_vars() 29946 1726882599.22071: done getting variables 29946 1726882599.22129: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:39 -0400 (0:00:00.867) 0:00:25.331 ****** 29946 1726882599.22159: entering _queue_task() for managed_node2/debug 29946 1726882599.22659: worker is 1 (out of 1 available) 29946 1726882599.22670: exiting _queue_task() for managed_node2/debug 29946 1726882599.22680: done queuing things up, now waiting for results queue to drain 29946 1726882599.22682: waiting for pending results... 29946 1726882599.23274: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 29946 1726882599.23339: in run() - task 12673a56-9f93-95e7-9dfb-000000000072 29946 1726882599.23373: variable 'ansible_search_path' from source: unknown 29946 1726882599.23377: variable 'ansible_search_path' from source: unknown 29946 1726882599.23391: calling self._execute() 29946 1726882599.23634: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.23640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.23643: variable 'omit' from source: magic vars 29946 1726882599.23999: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.24003: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.24024: variable 'omit' from source: magic vars 29946 1726882599.24062: variable 'omit' from source: magic vars 29946 1726882599.24242: variable 'network_provider' from source: set_fact 29946 1726882599.24247: variable 'omit' from source: magic vars 29946 1726882599.24250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882599.24272: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882599.24301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882599.24323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882599.24339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 
1726882599.24375: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882599.24383: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.24391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.24502: Set connection var ansible_pipelining to False 29946 1726882599.24515: Set connection var ansible_shell_executable to /bin/sh 29946 1726882599.24526: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882599.24536: Set connection var ansible_timeout to 10 29946 1726882599.24549: Set connection var ansible_shell_type to sh 29946 1726882599.24556: Set connection var ansible_connection to ssh 29946 1726882599.24676: variable 'ansible_shell_executable' from source: unknown 29946 1726882599.24680: variable 'ansible_connection' from source: unknown 29946 1726882599.24682: variable 'ansible_module_compression' from source: unknown 29946 1726882599.24684: variable 'ansible_shell_type' from source: unknown 29946 1726882599.24686: variable 'ansible_shell_executable' from source: unknown 29946 1726882599.24688: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.24690: variable 'ansible_pipelining' from source: unknown 29946 1726882599.24692: variable 'ansible_timeout' from source: unknown 29946 1726882599.24696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.24762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882599.24787: variable 'omit' from source: magic vars 29946 1726882599.24801: starting attempt loop 29946 1726882599.24809: running the handler 29946 1726882599.24861: handler run complete 29946 1726882599.24880: attempt loop complete, returning result 29946 1726882599.24887: _execute() done 29946 1726882599.24904: dumping result to json 29946 1726882599.24912: done dumping result, returning 29946 1726882599.24927: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-95e7-9dfb-000000000072] 29946 1726882599.24937: sending task result for task 12673a56-9f93-95e7-9dfb-000000000072 ok: [managed_node2] => {} MSG: Using network provider: nm 29946 1726882599.25182: no more pending results, returning what we have 29946 1726882599.25185: results queue empty 29946 1726882599.25186: checking for any_errors_fatal 29946 1726882599.25199: done checking for any_errors_fatal 29946 1726882599.25200: checking for max_fail_percentage 29946 1726882599.25201: done checking for max_fail_percentage 29946 1726882599.25202: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.25203: done checking to see if all hosts have failed 29946 1726882599.25204: getting the remaining hosts for this loop 29946 1726882599.25205: done getting the remaining hosts for this loop 29946 1726882599.25209: getting the next task for host managed_node2 29946 1726882599.25217: done getting next task for host managed_node2 29946 1726882599.25221: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29946 1726882599.25224: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882599.25235: getting variables 29946 1726882599.25237: in VariableManager get_vars() 29946 1726882599.25272: Calling all_inventory to load vars for managed_node2 29946 1726882599.25275: Calling groups_inventory to load vars for managed_node2 29946 1726882599.25278: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.25289: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.25292: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.25500: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.26303: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000072 29946 1726882599.26306: WORKER PROCESS EXITING 29946 1726882599.27292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.28890: done with get_vars() 29946 1726882599.28913: done getting variables 29946 1726882599.28966: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:39 -0400 (0:00:00.068) 0:00:25.399 ****** 29946 1726882599.28996: entering _queue_task() for managed_node2/fail 29946 1726882599.29255: worker is 1 (out of 1 available) 29946 1726882599.29267: exiting _queue_task() for managed_node2/fail 29946 1726882599.29278: done queuing things up, now waiting for results queue to drain 29946 1726882599.29279: waiting for pending results... 
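The trace above covers two steps: the "Check which packages are installed" task, whose result is censored because the task runs with no_log, and the "Print network provider" debug task, which resolves the network_provider fact set earlier and reports "Using network provider: nm". A minimal sketch of what two such tasks could look like, assuming the exact module options (only the no_log behaviour, the variable name and the message wording are visible in the trace):

# Illustrative sketches only; the real tasks live in the fedora.linux_system_roles
# collection at the task paths quoted in the trace.
- name: Check which packages are installed
  ansible.builtin.package_facts: {}
  no_log: true        # matches the "censored" result shown above

- name: Print network provider
  ansible.builtin.debug:
    msg: "Using network provider: {{ network_provider }}"
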
29946 1726882599.29540: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29946 1726882599.29801: in run() - task 12673a56-9f93-95e7-9dfb-000000000073 29946 1726882599.29824: variable 'ansible_search_path' from source: unknown 29946 1726882599.29831: variable 'ansible_search_path' from source: unknown 29946 1726882599.29867: calling self._execute() 29946 1726882599.30099: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.30113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.30129: variable 'omit' from source: magic vars 29946 1726882599.31100: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.31103: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.31168: variable 'network_state' from source: role '' defaults 29946 1726882599.31183: Evaluated conditional (network_state != {}): False 29946 1726882599.31191: when evaluation is False, skipping this task 29946 1726882599.31200: _execute() done 29946 1726882599.31211: dumping result to json 29946 1726882599.31219: done dumping result, returning 29946 1726882599.31230: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-95e7-9dfb-000000000073] 29946 1726882599.31238: sending task result for task 12673a56-9f93-95e7-9dfb-000000000073 29946 1726882599.31621: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000073 29946 1726882599.31624: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882599.31671: no more pending results, returning what we have 29946 1726882599.31675: results queue empty 29946 1726882599.31676: checking for any_errors_fatal 29946 1726882599.31685: done checking for any_errors_fatal 29946 1726882599.31685: checking for max_fail_percentage 29946 1726882599.31687: done checking for max_fail_percentage 29946 1726882599.31688: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.31689: done checking to see if all hosts have failed 29946 1726882599.31690: getting the remaining hosts for this loop 29946 1726882599.31691: done getting the remaining hosts for this loop 29946 1726882599.31697: getting the next task for host managed_node2 29946 1726882599.31703: done getting next task for host managed_node2 29946 1726882599.31707: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29946 1726882599.31710: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882599.31725: getting variables 29946 1726882599.31727: in VariableManager get_vars() 29946 1726882599.31764: Calling all_inventory to load vars for managed_node2 29946 1726882599.31766: Calling groups_inventory to load vars for managed_node2 29946 1726882599.31769: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.31781: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.31784: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.31787: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.34771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.36735: done with get_vars() 29946 1726882599.36756: done getting variables 29946 1726882599.37030: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:39 -0400 (0:00:00.080) 0:00:25.480 ****** 29946 1726882599.37059: entering _queue_task() for managed_node2/fail 29946 1726882599.37361: worker is 1 (out of 1 available) 29946 1726882599.37375: exiting _queue_task() for managed_node2/fail 29946 1726882599.37386: done queuing things up, now waiting for results queue to drain 29946 1726882599.37388: waiting for pending results... 
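The "Abort applying the network state configuration ... with the initscripts provider" task above is skipped because network_state still holds the role default of {}; Ansible records the first failing clause as false_condition, and the same guard skips the "below 8" abort that follows. A hedged sketch of such a guarded fail task, where only the when expression comes from the trace and the failure message is an assumption:

# Illustration only: the guard matches the false_condition recorded above; the msg
# text and any extra provider check implied by the task name are assumptions.
- name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported by the initscripts provider  # assumed wording
  when: network_state != {}
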
29946 1726882599.38009: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29946 1726882599.38154: in run() - task 12673a56-9f93-95e7-9dfb-000000000074 29946 1726882599.38175: variable 'ansible_search_path' from source: unknown 29946 1726882599.38443: variable 'ansible_search_path' from source: unknown 29946 1726882599.38447: calling self._execute() 29946 1726882599.38604: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.38616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.38629: variable 'omit' from source: magic vars 29946 1726882599.38982: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.39004: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.39131: variable 'network_state' from source: role '' defaults 29946 1726882599.39146: Evaluated conditional (network_state != {}): False 29946 1726882599.39153: when evaluation is False, skipping this task 29946 1726882599.39160: _execute() done 29946 1726882599.39167: dumping result to json 29946 1726882599.39174: done dumping result, returning 29946 1726882599.39185: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-95e7-9dfb-000000000074] 29946 1726882599.39197: sending task result for task 12673a56-9f93-95e7-9dfb-000000000074 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882599.39364: no more pending results, returning what we have 29946 1726882599.39367: results queue empty 29946 1726882599.39368: checking for any_errors_fatal 29946 1726882599.39376: done checking for any_errors_fatal 29946 1726882599.39376: checking for max_fail_percentage 29946 1726882599.39378: done checking for max_fail_percentage 29946 1726882599.39379: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.39380: done checking to see if all hosts have failed 29946 1726882599.39380: getting the remaining hosts for this loop 29946 1726882599.39382: done getting the remaining hosts for this loop 29946 1726882599.39386: getting the next task for host managed_node2 29946 1726882599.39391: done getting next task for host managed_node2 29946 1726882599.39396: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29946 1726882599.39399: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882599.39415: getting variables 29946 1726882599.39417: in VariableManager get_vars() 29946 1726882599.39454: Calling all_inventory to load vars for managed_node2 29946 1726882599.39456: Calling groups_inventory to load vars for managed_node2 29946 1726882599.39459: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.39471: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.39474: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.39477: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.40200: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000074 29946 1726882599.40204: WORKER PROCESS EXITING 29946 1726882599.42307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.44274: done with get_vars() 29946 1726882599.44299: done getting variables 29946 1726882599.44351: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:39 -0400 (0:00:00.073) 0:00:25.553 ****** 29946 1726882599.44387: entering _queue_task() for managed_node2/fail 29946 1726882599.44835: worker is 1 (out of 1 available) 29946 1726882599.44847: exiting _queue_task() for managed_node2/fail 29946 1726882599.44857: done queuing things up, now waiting for results queue to drain 29946 1726882599.44859: waiting for pending results... 
29946 1726882599.45048: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29946 1726882599.45163: in run() - task 12673a56-9f93-95e7-9dfb-000000000075 29946 1726882599.45185: variable 'ansible_search_path' from source: unknown 29946 1726882599.45204: variable 'ansible_search_path' from source: unknown 29946 1726882599.45245: calling self._execute() 29946 1726882599.45353: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.45364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.45378: variable 'omit' from source: magic vars 29946 1726882599.45764: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.45782: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.45966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882599.49391: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882599.49471: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882599.49523: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882599.49566: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882599.49599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882599.49743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.49747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.49749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.49985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.50007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.50298: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.50301: Evaluated conditional (ansible_distribution_major_version | int > 9): True 29946 1726882599.50600: variable 'ansible_distribution' from source: facts 29946 1726882599.50606: variable '__network_rh_distros' from source: role '' defaults 29946 1726882599.50608: Evaluated conditional (ansible_distribution in __network_rh_distros): True 29946 1726882599.51037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.51126: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.51401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.51499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.51504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.51507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.51509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.51540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.51584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.51642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.51740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.51770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.51804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.51853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.51875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.52225: variable 'network_connections' from source: play vars 29946 1726882599.52243: variable 'profile' from source: play vars 29946 1726882599.52330: variable 'profile' from source: play vars 29946 1726882599.52340: variable 'interface' from source: set_fact 29946 1726882599.52411: variable 'interface' from source: set_fact 29946 1726882599.52426: variable 'network_state' from source: role '' defaults 29946 
1726882599.52483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882599.52669: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882599.52723: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882599.52759: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882599.52794: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882599.52852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882599.52887: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882599.52927: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.52953: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882599.52979: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 29946 1726882599.52997: when evaluation is False, skipping this task 29946 1726882599.52999: _execute() done 29946 1726882599.53001: dumping result to json 29946 1726882599.53003: done dumping result, returning 29946 1726882599.53036: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-95e7-9dfb-000000000075] 29946 1726882599.53039: sending task result for task 12673a56-9f93-95e7-9dfb-000000000075 skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 29946 1726882599.53188: no more pending results, returning what we have 29946 1726882599.53194: results queue empty 29946 1726882599.53195: checking for any_errors_fatal 29946 1726882599.53202: done checking for any_errors_fatal 29946 1726882599.53202: checking for max_fail_percentage 29946 1726882599.53204: done checking for max_fail_percentage 29946 1726882599.53205: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.53206: done checking to see if all hosts have failed 29946 1726882599.53206: getting the remaining hosts for this loop 29946 1726882599.53208: done getting the remaining hosts for this loop 29946 1726882599.53212: getting the next task for host managed_node2 29946 1726882599.53218: done getting next task for host managed_node2 29946 1726882599.53221: ^ task is: TASK: 
fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29946 1726882599.53223: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882599.53235: getting variables 29946 1726882599.53237: in VariableManager get_vars() 29946 1726882599.53283: Calling all_inventory to load vars for managed_node2 29946 1726882599.53286: Calling groups_inventory to load vars for managed_node2 29946 1726882599.53288: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.53302: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.53305: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.53308: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.54701: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000075 29946 1726882599.54705: WORKER PROCESS EXITING 29946 1726882599.56107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.59104: done with get_vars() 29946 1726882599.59137: done getting variables 29946 1726882599.59304: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:39 -0400 (0:00:00.149) 0:00:25.702 ****** 29946 1726882599.59336: entering _queue_task() for managed_node2/dnf 29946 1726882599.59905: worker is 1 (out of 1 available) 29946 1726882599.59919: exiting _queue_task() for managed_node2/dnf 29946 1726882599.59929: done queuing things up, now waiting for results queue to drain 29946 1726882599.59931: waiting for pending results... 
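The teaming abort above first passes the distribution gates (major version > 9, distribution in __network_rh_distros) and is then skipped on the team-interface detection quoted verbatim in false_condition. Rewritten as a task guard purely for readability (the condition is copied from the skip result; the fail message is an assumed placeholder):

# Condition copied from the skip result above; msg is an assumed placeholder.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on this system version  # assumed wording
  when: >-
    network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
    or network_state.get("interfaces", [])
      | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
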
29946 1726882599.60138: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29946 1726882599.60271: in run() - task 12673a56-9f93-95e7-9dfb-000000000076 29946 1726882599.60302: variable 'ansible_search_path' from source: unknown 29946 1726882599.60311: variable 'ansible_search_path' from source: unknown 29946 1726882599.60458: calling self._execute() 29946 1726882599.60572: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.60585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.60607: variable 'omit' from source: magic vars 29946 1726882599.61043: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.61047: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.61259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882599.64422: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882599.64501: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882599.64553: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882599.64594: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882599.64627: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882599.64770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.64773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.64791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.64842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.64862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.64996: variable 'ansible_distribution' from source: facts 29946 1726882599.65007: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.65027: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 29946 1726882599.65202: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882599.65280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.65317: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.65347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.65391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.65421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.65466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.65497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.65535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.65637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.65640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.65644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.65673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.65703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.65752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.65773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.66498: variable 'network_connections' from source: play vars 29946 1726882599.66501: variable 'profile' from source: play vars 29946 1726882599.66505: variable 'profile' from source: play vars 29946 1726882599.66507: variable 'interface' from source: set_fact 29946 1726882599.66510: variable 'interface' from source: set_fact 29946 1726882599.66541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 29946 1726882599.66733: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882599.66752: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882599.66780: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882599.66815: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882599.66865: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882599.66950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882599.66959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.66961: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882599.66997: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882599.67386: variable 'network_connections' from source: play vars 29946 1726882599.67390: variable 'profile' from source: play vars 29946 1726882599.67600: variable 'profile' from source: play vars 29946 1726882599.67603: variable 'interface' from source: set_fact 29946 1726882599.67606: variable 'interface' from source: set_fact 29946 1726882599.67798: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882599.67801: when evaluation is False, skipping this task 29946 1726882599.67803: _execute() done 29946 1726882599.67805: dumping result to json 29946 1726882599.67807: done dumping result, returning 29946 1726882599.67809: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000076] 29946 1726882599.67813: sending task result for task 12673a56-9f93-95e7-9dfb-000000000076 29946 1726882599.67887: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000076 29946 1726882599.67890: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882599.67945: no more pending results, returning what we have 29946 1726882599.67949: results queue empty 29946 1726882599.67950: checking for any_errors_fatal 29946 1726882599.67958: done checking for any_errors_fatal 29946 1726882599.67959: checking for max_fail_percentage 29946 1726882599.67961: done checking for max_fail_percentage 29946 1726882599.67962: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.67963: done checking to see if all hosts have failed 29946 1726882599.67964: getting the remaining hosts for this loop 29946 1726882599.67965: done getting the remaining hosts for this loop 29946 
1726882599.67969: getting the next task for host managed_node2 29946 1726882599.67976: done getting next task for host managed_node2 29946 1726882599.67980: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29946 1726882599.67982: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882599.67999: getting variables 29946 1726882599.68001: in VariableManager get_vars() 29946 1726882599.68042: Calling all_inventory to load vars for managed_node2 29946 1726882599.68045: Calling groups_inventory to load vars for managed_node2 29946 1726882599.68047: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.68059: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.68062: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.68065: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.70962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.72720: done with get_vars() 29946 1726882599.72742: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29946 1726882599.72816: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:39 -0400 (0:00:00.135) 0:00:25.838 ****** 29946 1726882599.72847: entering _queue_task() for managed_node2/yum 29946 1726882599.73173: worker is 1 (out of 1 available) 29946 1726882599.73183: exiting _queue_task() for managed_node2/yum 29946 1726882599.73596: done queuing things up, now waiting for results queue to drain 29946 1726882599.73598: waiting for pending results... 
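The DNF update check above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined evaluates to true for the requested profile. The trace says both flags come from the role's defaults; the sketch below uses set_fact only as a compact stand-in to illustrate how such flags could plausibly be derived from network_connections. The expressions are assumptions; only the variable names appear in the trace.

# Illustration only: variable names match the trace, the derivation is assumed.
- name: Derive wireless/team detection flags
  ansible.builtin.set_fact:
    __network_wireless_connections_defined: >-
      {{ network_connections | selectattr("type", "defined")
         | selectattr("type", "match", "^wireless$") | list | length > 0 }}
    __network_team_connections_defined: >-
      {{ network_connections | selectattr("type", "defined")
         | selectattr("type", "match", "^team$") | list | length > 0 }}
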
29946 1726882599.73942: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29946 1726882599.73947: in run() - task 12673a56-9f93-95e7-9dfb-000000000077 29946 1726882599.73950: variable 'ansible_search_path' from source: unknown 29946 1726882599.73952: variable 'ansible_search_path' from source: unknown 29946 1726882599.74021: calling self._execute() 29946 1726882599.74235: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.74248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.74267: variable 'omit' from source: magic vars 29946 1726882599.75098: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.75102: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.75486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882599.78366: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882599.78498: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882599.78519: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882599.78559: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882599.78599: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882599.78757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.78944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.78976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.79028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.79115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.79441: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.79444: Evaluated conditional (ansible_distribution_major_version | int < 8): False 29946 1726882599.79446: when evaluation is False, skipping this task 29946 1726882599.79448: _execute() done 29946 1726882599.79450: dumping result to json 29946 1726882599.79452: done dumping result, returning 29946 1726882599.79454: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000077] 29946 
1726882599.79456: sending task result for task 12673a56-9f93-95e7-9dfb-000000000077 29946 1726882599.79528: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000077 29946 1726882599.79531: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 29946 1726882599.79595: no more pending results, returning what we have 29946 1726882599.79599: results queue empty 29946 1726882599.79600: checking for any_errors_fatal 29946 1726882599.79610: done checking for any_errors_fatal 29946 1726882599.79611: checking for max_fail_percentage 29946 1726882599.79614: done checking for max_fail_percentage 29946 1726882599.79615: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.79616: done checking to see if all hosts have failed 29946 1726882599.79617: getting the remaining hosts for this loop 29946 1726882599.79618: done getting the remaining hosts for this loop 29946 1726882599.79622: getting the next task for host managed_node2 29946 1726882599.79629: done getting next task for host managed_node2 29946 1726882599.79633: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29946 1726882599.79635: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882599.79649: getting variables 29946 1726882599.79651: in VariableManager get_vars() 29946 1726882599.79690: Calling all_inventory to load vars for managed_node2 29946 1726882599.79694: Calling groups_inventory to load vars for managed_node2 29946 1726882599.79697: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.79708: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.79711: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.79714: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.82696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.84333: done with get_vars() 29946 1726882599.84358: done getting variables 29946 1726882599.84426: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:39 -0400 (0:00:00.116) 0:00:25.954 ****** 29946 1726882599.84458: entering _queue_task() for managed_node2/fail 29946 1726882599.84920: worker is 1 (out of 1 available) 29946 1726882599.84931: exiting _queue_task() for managed_node2/fail 29946 1726882599.84941: done queuing things up, now waiting for results queue to drain 29946 1726882599.84942: waiting for pending results... 
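Taken together, the two update checks split on the package manager: the DNF variant ran its conditionals (ansible_distribution == 'Fedora' or major version > 7 evaluated True), while the YUM variant above is skipped outright on ansible_distribution_major_version | int < 8. A sketch of that split, where only the when expressions come from the trace and the module arguments and check_mode usage are placeholders:

# Sketch of the DNF/YUM split suggested by the evaluated conditionals above;
# package name, state and check_mode are assumptions.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: NetworkManager   # placeholder package name
    state: latest
  check_mode: true
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
    - __network_wireless_connections_defined or __network_team_connections_defined

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:
    name: NetworkManager   # placeholder package name
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8
    - __network_wireless_connections_defined or __network_team_connections_defined
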
29946 1726882599.85139: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29946 1726882599.85276: in run() - task 12673a56-9f93-95e7-9dfb-000000000078 29946 1726882599.85279: variable 'ansible_search_path' from source: unknown 29946 1726882599.85282: variable 'ansible_search_path' from source: unknown 29946 1726882599.85311: calling self._execute() 29946 1726882599.85412: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.85424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.85451: variable 'omit' from source: magic vars 29946 1726882599.85825: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.85889: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.85975: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882599.86190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882599.88432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882599.88508: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882599.88551: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882599.88590: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882599.88699: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882599.88722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.89086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.89118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.89165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.89185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.89242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.89274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.89302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.89342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.89362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.89416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882599.89483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882599.89486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.89523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882599.89544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882599.89733: variable 'network_connections' from source: play vars 29946 1726882599.89754: variable 'profile' from source: play vars 29946 1726882599.89918: variable 'profile' from source: play vars 29946 1726882599.89922: variable 'interface' from source: set_fact 29946 1726882599.89948: variable 'interface' from source: set_fact 29946 1726882599.90036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882599.90248: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882599.90286: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882599.90355: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882599.90360: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882599.90411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882599.90452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882599.90486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882599.90572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882599.90575: 
variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882599.90822: variable 'network_connections' from source: play vars 29946 1726882599.90831: variable 'profile' from source: play vars 29946 1726882599.90887: variable 'profile' from source: play vars 29946 1726882599.90896: variable 'interface' from source: set_fact 29946 1726882599.90962: variable 'interface' from source: set_fact 29946 1726882599.91018: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882599.91020: when evaluation is False, skipping this task 29946 1726882599.91022: _execute() done 29946 1726882599.91024: dumping result to json 29946 1726882599.91026: done dumping result, returning 29946 1726882599.91029: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000078] 29946 1726882599.91098: sending task result for task 12673a56-9f93-95e7-9dfb-000000000078 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882599.91280: no more pending results, returning what we have 29946 1726882599.91284: results queue empty 29946 1726882599.91285: checking for any_errors_fatal 29946 1726882599.91292: done checking for any_errors_fatal 29946 1726882599.91295: checking for max_fail_percentage 29946 1726882599.91297: done checking for max_fail_percentage 29946 1726882599.91298: checking to see if all hosts have failed and the running result is not ok 29946 1726882599.91299: done checking to see if all hosts have failed 29946 1726882599.91299: getting the remaining hosts for this loop 29946 1726882599.91301: done getting the remaining hosts for this loop 29946 1726882599.91305: getting the next task for host managed_node2 29946 1726882599.91312: done getting next task for host managed_node2 29946 1726882599.91317: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 29946 1726882599.91319: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882599.91335: getting variables 29946 1726882599.91339: in VariableManager get_vars() 29946 1726882599.91380: Calling all_inventory to load vars for managed_node2 29946 1726882599.91383: Calling groups_inventory to load vars for managed_node2 29946 1726882599.91385: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882599.91611: Calling all_plugins_play to load vars for managed_node2 29946 1726882599.91616: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882599.91620: Calling groups_plugins_play to load vars for managed_node2 29946 1726882599.92253: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000078 29946 1726882599.92256: WORKER PROCESS EXITING 29946 1726882599.93791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882599.95976: done with get_vars() 29946 1726882599.96016: done getting variables 29946 1726882599.96081: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:39 -0400 (0:00:00.116) 0:00:26.070 ****** 29946 1726882599.96118: entering _queue_task() for managed_node2/package 29946 1726882599.96729: worker is 1 (out of 1 available) 29946 1726882599.96740: exiting _queue_task() for managed_node2/package 29946 1726882599.96752: done queuing things up, now waiting for results queue to drain 29946 1726882599.96753: waiting for pending results... 
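The skip recorded above for the "Ask user's consent to restart NetworkManager due to wireless or team interfaces" task follows the role's usual guard pattern: the executor evaluates the task's when expression (__network_wireless_connections_defined or __network_team_connections_defined), and because this play defines neither wireless nor team connections the expression is False, so the task is skipped and the false_condition is reported back in the result. A minimal sketch of such a guarded task is shown below; the use of ansible.builtin.pause and the prompt wording are assumptions for illustration only, not the role's actual source.

- name: Ask user's consent to restart NetworkManager  # illustrative sketch, not the role's real task
  ansible.builtin.pause:
    prompt: "Wireless or team profiles require a NetworkManager restart. Press Enter to continue."
  # Same guard expression the executor evaluated in the log above.
  when: __network_wireless_connections_defined or __network_team_connections_defined
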
29946 1726882599.97066: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 29946 1726882599.97185: in run() - task 12673a56-9f93-95e7-9dfb-000000000079 29946 1726882599.97215: variable 'ansible_search_path' from source: unknown 29946 1726882599.97224: variable 'ansible_search_path' from source: unknown 29946 1726882599.97264: calling self._execute() 29946 1726882599.97370: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882599.97382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882599.97399: variable 'omit' from source: magic vars 29946 1726882599.97801: variable 'ansible_distribution_major_version' from source: facts 29946 1726882599.97819: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882599.98033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882599.98389: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882599.98394: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882599.98433: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882599.98520: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882599.98639: variable 'network_packages' from source: role '' defaults 29946 1726882599.98762: variable '__network_provider_setup' from source: role '' defaults 29946 1726882599.98785: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882599.98868: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882599.98883: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882599.98950: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882599.99280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882600.02020: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882600.02082: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882600.02122: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882600.02152: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882600.02176: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882600.02462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.02489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.02518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.02556: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.02570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.02640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.02662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.02685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.02727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.02760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.02978: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29946 1726882600.03087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.03111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.03134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.03170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.03183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.03277: variable 'ansible_python' from source: facts 29946 1726882600.03308: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29946 1726882600.03389: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882600.03470: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882600.03627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.03698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29946 1726882600.03702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.03705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.03708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.03751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.03773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.03800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.03840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.03849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.03988: variable 'network_connections' from source: play vars 29946 1726882600.03997: variable 'profile' from source: play vars 29946 1726882600.04169: variable 'profile' from source: play vars 29946 1726882600.04173: variable 'interface' from source: set_fact 29946 1726882600.04175: variable 'interface' from source: set_fact 29946 1726882600.04237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882600.04261: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882600.04292: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.04324: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882600.04367: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882600.04646: variable 'network_connections' from source: play vars 29946 1726882600.04649: variable 'profile' from source: play vars 29946 1726882600.04798: variable 'profile' from source: play vars 29946 1726882600.04802: variable 'interface' from source: set_fact 29946 1726882600.04824: variable 'interface' from source: set_fact 29946 1726882600.04855: variable 
'__network_packages_default_wireless' from source: role '' defaults 29946 1726882600.04941: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882600.05253: variable 'network_connections' from source: play vars 29946 1726882600.05256: variable 'profile' from source: play vars 29946 1726882600.05310: variable 'profile' from source: play vars 29946 1726882600.05313: variable 'interface' from source: set_fact 29946 1726882600.05568: variable 'interface' from source: set_fact 29946 1726882600.05571: variable '__network_packages_default_team' from source: role '' defaults 29946 1726882600.05574: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882600.05858: variable 'network_connections' from source: play vars 29946 1726882600.05869: variable 'profile' from source: play vars 29946 1726882600.05963: variable 'profile' from source: play vars 29946 1726882600.05973: variable 'interface' from source: set_fact 29946 1726882600.06108: variable 'interface' from source: set_fact 29946 1726882600.06175: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882600.06247: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882600.06260: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882600.06331: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882600.06562: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29946 1726882600.07070: variable 'network_connections' from source: play vars 29946 1726882600.07080: variable 'profile' from source: play vars 29946 1726882600.07161: variable 'profile' from source: play vars 29946 1726882600.07170: variable 'interface' from source: set_fact 29946 1726882600.07245: variable 'interface' from source: set_fact 29946 1726882600.07299: variable 'ansible_distribution' from source: facts 29946 1726882600.07302: variable '__network_rh_distros' from source: role '' defaults 29946 1726882600.07305: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.07307: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29946 1726882600.07470: variable 'ansible_distribution' from source: facts 29946 1726882600.07478: variable '__network_rh_distros' from source: role '' defaults 29946 1726882600.07486: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.07504: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29946 1726882600.07666: variable 'ansible_distribution' from source: facts 29946 1726882600.07767: variable '__network_rh_distros' from source: role '' defaults 29946 1726882600.07769: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.07771: variable 'network_provider' from source: set_fact 29946 1726882600.07774: variable 'ansible_facts' from source: unknown 29946 1726882600.08475: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 29946 1726882600.08484: when evaluation is False, skipping this task 29946 1726882600.08492: _execute() done 29946 1726882600.08502: dumping result to json 29946 1726882600.08521: done dumping result, returning 29946 1726882600.08536: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[12673a56-9f93-95e7-9dfb-000000000079] 29946 1726882600.08553: sending task result for task 12673a56-9f93-95e7-9dfb-000000000079 29946 1726882600.08782: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000079 29946 1726882600.08786: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 29946 1726882600.08847: no more pending results, returning what we have 29946 1726882600.08851: results queue empty 29946 1726882600.08852: checking for any_errors_fatal 29946 1726882600.08871: done checking for any_errors_fatal 29946 1726882600.08872: checking for max_fail_percentage 29946 1726882600.08874: done checking for max_fail_percentage 29946 1726882600.08875: checking to see if all hosts have failed and the running result is not ok 29946 1726882600.08876: done checking to see if all hosts have failed 29946 1726882600.08877: getting the remaining hosts for this loop 29946 1726882600.08878: done getting the remaining hosts for this loop 29946 1726882600.08889: getting the next task for host managed_node2 29946 1726882600.08899: done getting next task for host managed_node2 29946 1726882600.08903: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29946 1726882600.08906: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882600.08921: getting variables 29946 1726882600.08923: in VariableManager get_vars() 29946 1726882600.08961: Calling all_inventory to load vars for managed_node2 29946 1726882600.09080: Calling groups_inventory to load vars for managed_node2 29946 1726882600.09084: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882600.09107: Calling all_plugins_play to load vars for managed_node2 29946 1726882600.09110: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882600.09113: Calling groups_plugins_play to load vars for managed_node2 29946 1726882600.10566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882600.12794: done with get_vars() 29946 1726882600.12818: done getting variables 29946 1726882600.12874: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:40 -0400 (0:00:00.167) 0:00:26.238 ****** 29946 1726882600.12914: entering _queue_task() for managed_node2/package 29946 1726882600.13240: worker is 1 (out of 1 available) 29946 1726882600.13253: exiting _queue_task() for managed_node2/package 29946 1726882600.13264: done queuing things up, now waiting for results queue to drain 29946 1726882600.13266: waiting for pending results... 
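The "Install packages" task above is skipped because its condition, not network_packages is subset(ansible_facts.packages.keys()), evaluates to False: every package listed in network_packages already appears in the package facts gathered earlier in the play, so no install transaction is needed. A hedged sketch of that pattern follows, assuming the package facts were collected with ansible.builtin.package_facts; the task bodies are illustrative rather than the role's actual source.

- name: Gather installed package facts   # provides ansible_facts.packages used by the guard below
  ansible.builtin.package_facts:
    manager: auto

- name: Install packages                 # skipped when everything is already installed
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
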
29946 1726882600.13662: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29946 1726882600.13666: in run() - task 12673a56-9f93-95e7-9dfb-00000000007a 29946 1726882600.13681: variable 'ansible_search_path' from source: unknown 29946 1726882600.13691: variable 'ansible_search_path' from source: unknown 29946 1726882600.13730: calling self._execute() 29946 1726882600.13835: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882600.13848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882600.13866: variable 'omit' from source: magic vars 29946 1726882600.14301: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.14306: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882600.14430: variable 'network_state' from source: role '' defaults 29946 1726882600.14447: Evaluated conditional (network_state != {}): False 29946 1726882600.14454: when evaluation is False, skipping this task 29946 1726882600.14460: _execute() done 29946 1726882600.14467: dumping result to json 29946 1726882600.14473: done dumping result, returning 29946 1726882600.14518: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-95e7-9dfb-00000000007a] 29946 1726882600.14522: sending task result for task 12673a56-9f93-95e7-9dfb-00000000007a skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882600.14725: no more pending results, returning what we have 29946 1726882600.14735: results queue empty 29946 1726882600.14737: checking for any_errors_fatal 29946 1726882600.14746: done checking for any_errors_fatal 29946 1726882600.14747: checking for max_fail_percentage 29946 1726882600.14749: done checking for max_fail_percentage 29946 1726882600.14750: checking to see if all hosts have failed and the running result is not ok 29946 1726882600.14751: done checking to see if all hosts have failed 29946 1726882600.14752: getting the remaining hosts for this loop 29946 1726882600.14754: done getting the remaining hosts for this loop 29946 1726882600.14757: getting the next task for host managed_node2 29946 1726882600.14765: done getting next task for host managed_node2 29946 1726882600.14769: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29946 1726882600.14772: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882600.14794: getting variables 29946 1726882600.14796: in VariableManager get_vars() 29946 1726882600.14833: Calling all_inventory to load vars for managed_node2 29946 1726882600.14836: Calling groups_inventory to load vars for managed_node2 29946 1726882600.15009: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882600.15021: Calling all_plugins_play to load vars for managed_node2 29946 1726882600.15024: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882600.15028: Calling groups_plugins_play to load vars for managed_node2 29946 1726882600.15706: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000007a 29946 1726882600.15709: WORKER PROCESS EXITING 29946 1726882600.16701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882600.18397: done with get_vars() 29946 1726882600.18419: done getting variables 29946 1726882600.18482: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:40 -0400 (0:00:00.056) 0:00:26.294 ****** 29946 1726882600.18518: entering _queue_task() for managed_node2/package 29946 1726882600.19143: worker is 1 (out of 1 available) 29946 1726882600.19154: exiting _queue_task() for managed_node2/package 29946 1726882600.19164: done queuing things up, now waiting for results queue to drain 29946 1726882600.19165: waiting for pending results... 
29946 1726882600.19711: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29946 1726882600.19716: in run() - task 12673a56-9f93-95e7-9dfb-00000000007b 29946 1726882600.19802: variable 'ansible_search_path' from source: unknown 29946 1726882600.19813: variable 'ansible_search_path' from source: unknown 29946 1726882600.19848: calling self._execute() 29946 1726882600.19945: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882600.20108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882600.20125: variable 'omit' from source: magic vars 29946 1726882600.20689: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.20999: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882600.21037: variable 'network_state' from source: role '' defaults 29946 1726882600.21112: Evaluated conditional (network_state != {}): False 29946 1726882600.21120: when evaluation is False, skipping this task 29946 1726882600.21126: _execute() done 29946 1726882600.21132: dumping result to json 29946 1726882600.21139: done dumping result, returning 29946 1726882600.21148: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-95e7-9dfb-00000000007b] 29946 1726882600.21157: sending task result for task 12673a56-9f93-95e7-9dfb-00000000007b skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882600.21363: no more pending results, returning what we have 29946 1726882600.21367: results queue empty 29946 1726882600.21368: checking for any_errors_fatal 29946 1726882600.21376: done checking for any_errors_fatal 29946 1726882600.21377: checking for max_fail_percentage 29946 1726882600.21379: done checking for max_fail_percentage 29946 1726882600.21380: checking to see if all hosts have failed and the running result is not ok 29946 1726882600.21381: done checking to see if all hosts have failed 29946 1726882600.21382: getting the remaining hosts for this loop 29946 1726882600.21384: done getting the remaining hosts for this loop 29946 1726882600.21391: getting the next task for host managed_node2 29946 1726882600.21400: done getting next task for host managed_node2 29946 1726882600.21405: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29946 1726882600.21407: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882600.21424: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000007b 29946 1726882600.21427: WORKER PROCESS EXITING 29946 1726882600.21436: getting variables 29946 1726882600.21438: in VariableManager get_vars() 29946 1726882600.21474: Calling all_inventory to load vars for managed_node2 29946 1726882600.21477: Calling groups_inventory to load vars for managed_node2 29946 1726882600.21479: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882600.21497: Calling all_plugins_play to load vars for managed_node2 29946 1726882600.21501: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882600.21504: Calling groups_plugins_play to load vars for managed_node2 29946 1726882600.22973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882600.25056: done with get_vars() 29946 1726882600.25078: done getting variables 29946 1726882600.25142: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:40 -0400 (0:00:00.066) 0:00:26.361 ****** 29946 1726882600.25172: entering _queue_task() for managed_node2/service 29946 1726882600.25609: worker is 1 (out of 1 available) 29946 1726882600.25619: exiting _queue_task() for managed_node2/service 29946 1726882600.25627: done queuing things up, now waiting for results queue to drain 29946 1726882600.25629: waiting for pending results... 
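Both nmstate-related tasks above ("Install NetworkManager and nmstate when using network_state variable" and "Install python3-libnmstate when using network_state variable") are skipped for the same reason: network_state keeps its role default of {}, so the guard network_state != {} is False. A sketch of that guard is below; the package list is inferred from the task title and should be treated as an assumption.

- name: Install NetworkManager and nmstate when using network_state variable  # illustrative sketch
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}   # False in this run, because network_state is left at its default {}
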
29946 1726882600.25732: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29946 1726882600.25841: in run() - task 12673a56-9f93-95e7-9dfb-00000000007c 29946 1726882600.25867: variable 'ansible_search_path' from source: unknown 29946 1726882600.25874: variable 'ansible_search_path' from source: unknown 29946 1726882600.25913: calling self._execute() 29946 1726882600.26073: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882600.26077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882600.26080: variable 'omit' from source: magic vars 29946 1726882600.26422: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.26442: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882600.26567: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882600.26780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882600.34146: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882600.34198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882600.34226: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882600.34248: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882600.34268: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882600.34321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.34342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.34359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.34384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.34401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.34432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.34451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.34467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29946 1726882600.34496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.34508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.34535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.34551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.34568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.34597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.34609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.34717: variable 'network_connections' from source: play vars 29946 1726882600.34725: variable 'profile' from source: play vars 29946 1726882600.34775: variable 'profile' from source: play vars 29946 1726882600.34778: variable 'interface' from source: set_fact 29946 1726882600.34826: variable 'interface' from source: set_fact 29946 1726882600.34873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882600.34973: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882600.35013: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882600.35036: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882600.35057: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882600.35086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882600.35107: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882600.35124: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.35141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882600.35173: variable '__network_team_connections_defined' from source: role '' defaults 29946 
1726882600.35469: variable 'network_connections' from source: play vars 29946 1726882600.35472: variable 'profile' from source: play vars 29946 1726882600.35475: variable 'profile' from source: play vars 29946 1726882600.35477: variable 'interface' from source: set_fact 29946 1726882600.35566: variable 'interface' from source: set_fact 29946 1726882600.35610: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882600.35619: when evaluation is False, skipping this task 29946 1726882600.35625: _execute() done 29946 1726882600.35631: dumping result to json 29946 1726882600.35638: done dumping result, returning 29946 1726882600.35648: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-00000000007c] 29946 1726882600.35662: sending task result for task 12673a56-9f93-95e7-9dfb-00000000007c 29946 1726882600.35852: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000007c 29946 1726882600.35855: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882600.35900: no more pending results, returning what we have 29946 1726882600.35903: results queue empty 29946 1726882600.35904: checking for any_errors_fatal 29946 1726882600.35910: done checking for any_errors_fatal 29946 1726882600.35911: checking for max_fail_percentage 29946 1726882600.35913: done checking for max_fail_percentage 29946 1726882600.35914: checking to see if all hosts have failed and the running result is not ok 29946 1726882600.35915: done checking to see if all hosts have failed 29946 1726882600.35916: getting the remaining hosts for this loop 29946 1726882600.35917: done getting the remaining hosts for this loop 29946 1726882600.35921: getting the next task for host managed_node2 29946 1726882600.35927: done getting next task for host managed_node2 29946 1726882600.35931: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29946 1726882600.35933: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882600.35948: getting variables 29946 1726882600.35950: in VariableManager get_vars() 29946 1726882600.36078: Calling all_inventory to load vars for managed_node2 29946 1726882600.36081: Calling groups_inventory to load vars for managed_node2 29946 1726882600.36083: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882600.36097: Calling all_plugins_play to load vars for managed_node2 29946 1726882600.36100: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882600.36102: Calling groups_plugins_play to load vars for managed_node2 29946 1726882600.40704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882600.41937: done with get_vars() 29946 1726882600.41958: done getting variables 29946 1726882600.42012: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:40 -0400 (0:00:00.168) 0:00:26.529 ****** 29946 1726882600.42039: entering _queue_task() for managed_node2/service 29946 1726882600.42389: worker is 1 (out of 1 available) 29946 1726882600.42403: exiting _queue_task() for managed_node2/service 29946 1726882600.42415: done queuing things up, now waiting for results queue to drain 29946 1726882600.42417: waiting for pending results... 29946 1726882600.42755: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29946 1726882600.42919: in run() - task 12673a56-9f93-95e7-9dfb-00000000007d 29946 1726882600.42923: variable 'ansible_search_path' from source: unknown 29946 1726882600.42926: variable 'ansible_search_path' from source: unknown 29946 1726882600.42929: calling self._execute() 29946 1726882600.42959: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882600.42963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882600.42977: variable 'omit' from source: magic vars 29946 1726882600.43314: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.43323: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882600.43436: variable 'network_provider' from source: set_fact 29946 1726882600.43440: variable 'network_state' from source: role '' defaults 29946 1726882600.43450: Evaluated conditional (network_provider == "nm" or network_state != {}): True 29946 1726882600.43452: variable 'omit' from source: magic vars 29946 1726882600.43481: variable 'omit' from source: magic vars 29946 1726882600.43507: variable 'network_service_name' from source: role '' defaults 29946 1726882600.43556: variable 'network_service_name' from source: role '' defaults 29946 1726882600.43629: variable '__network_provider_setup' from source: role '' defaults 29946 1726882600.43635: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882600.43679: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882600.43687: variable '__network_packages_default_nm' from source: role '' defaults 
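Unlike the skipped tasks before it, the "Enable and start NetworkManager" task proceeds: its condition network_provider == "nm" or network_state != {} evaluated True, network_service_name is resolved from the role defaults, and the executor goes on to set the connection variables and open the SSH session shown in the lines that follow. A minimal sketch of a service task guarded the same way is given below; the exact module arguments are assumptions for illustration, not the role's real task.

- name: Enable and start NetworkManager   # illustrative sketch of the guarded service task
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
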
29946 1726882600.43734: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882600.43876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882600.46108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882600.46111: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882600.46120: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882600.46156: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882600.46182: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882600.46262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.46296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.46322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.46352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.46363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.46398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.46422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.46443: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.46469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.46479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.46642: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29946 1726882600.46720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.46736: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.46752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.46781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.46792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.46871: variable 'ansible_python' from source: facts 29946 1726882600.46883: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29946 1726882600.46940: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882600.46996: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882600.47071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.47096: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.47113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.47137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.47148: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.47179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882600.47206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882600.47223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.47247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882600.47257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882600.47350: variable 'network_connections' from 
source: play vars 29946 1726882600.47356: variable 'profile' from source: play vars 29946 1726882600.47408: variable 'profile' from source: play vars 29946 1726882600.47412: variable 'interface' from source: set_fact 29946 1726882600.47457: variable 'interface' from source: set_fact 29946 1726882600.47529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882600.47783: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882600.47821: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882600.47851: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882600.47882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882600.47925: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882600.47948: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882600.47982: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882600.48013: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882600.48199: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882600.48283: variable 'network_connections' from source: play vars 29946 1726882600.48289: variable 'profile' from source: play vars 29946 1726882600.48345: variable 'profile' from source: play vars 29946 1726882600.48348: variable 'interface' from source: set_fact 29946 1726882600.48392: variable 'interface' from source: set_fact 29946 1726882600.48416: variable '__network_packages_default_wireless' from source: role '' defaults 29946 1726882600.48471: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882600.48653: variable 'network_connections' from source: play vars 29946 1726882600.48657: variable 'profile' from source: play vars 29946 1726882600.48707: variable 'profile' from source: play vars 29946 1726882600.48711: variable 'interface' from source: set_fact 29946 1726882600.48762: variable 'interface' from source: set_fact 29946 1726882600.48781: variable '__network_packages_default_team' from source: role '' defaults 29946 1726882600.48835: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882600.49098: variable 'network_connections' from source: play vars 29946 1726882600.49101: variable 'profile' from source: play vars 29946 1726882600.49126: variable 'profile' from source: play vars 29946 1726882600.49131: variable 'interface' from source: set_fact 29946 1726882600.49223: variable 'interface' from source: set_fact 29946 1726882600.49243: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882600.49297: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882600.49303: 
variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882600.49498: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882600.49566: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29946 1726882600.49892: variable 'network_connections' from source: play vars 29946 1726882600.49898: variable 'profile' from source: play vars 29946 1726882600.49945: variable 'profile' from source: play vars 29946 1726882600.49949: variable 'interface' from source: set_fact 29946 1726882600.49991: variable 'interface' from source: set_fact 29946 1726882600.49996: variable 'ansible_distribution' from source: facts 29946 1726882600.50001: variable '__network_rh_distros' from source: role '' defaults 29946 1726882600.50007: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.50019: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29946 1726882600.50134: variable 'ansible_distribution' from source: facts 29946 1726882600.50138: variable '__network_rh_distros' from source: role '' defaults 29946 1726882600.50141: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.50159: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29946 1726882600.50267: variable 'ansible_distribution' from source: facts 29946 1726882600.50276: variable '__network_rh_distros' from source: role '' defaults 29946 1726882600.50280: variable 'ansible_distribution_major_version' from source: facts 29946 1726882600.50300: variable 'network_provider' from source: set_fact 29946 1726882600.50317: variable 'omit' from source: magic vars 29946 1726882600.50337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882600.50357: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882600.50377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882600.50389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882600.50398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882600.50420: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882600.50423: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882600.50426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882600.50495: Set connection var ansible_pipelining to False 29946 1726882600.50501: Set connection var ansible_shell_executable to /bin/sh 29946 1726882600.50506: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882600.50512: Set connection var ansible_timeout to 10 29946 1726882600.50518: Set connection var ansible_shell_type to sh 29946 1726882600.50520: Set connection var ansible_connection to ssh 29946 1726882600.50538: variable 'ansible_shell_executable' from source: unknown 29946 1726882600.50541: variable 'ansible_connection' from source: unknown 29946 1726882600.50543: variable 'ansible_module_compression' from source: unknown 29946 1726882600.50545: variable 'ansible_shell_type' from source: unknown 29946 1726882600.50548: variable 'ansible_shell_executable' from 
source: unknown 29946 1726882600.50550: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882600.50556: variable 'ansible_pipelining' from source: unknown 29946 1726882600.50558: variable 'ansible_timeout' from source: unknown 29946 1726882600.50560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882600.50633: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882600.50641: variable 'omit' from source: magic vars 29946 1726882600.50646: starting attempt loop 29946 1726882600.50649: running the handler 29946 1726882600.50704: variable 'ansible_facts' from source: unknown 29946 1726882600.51159: _low_level_execute_command(): starting 29946 1726882600.51163: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882600.51676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882600.51682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.51684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882600.51686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882600.51688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.51731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882600.51741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882600.51748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882600.51829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882600.53503: stdout chunk (state=3): >>>/root <<< 29946 1726882600.53605: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882600.53634: stderr chunk (state=3): >>><<< 29946 1726882600.53637: stdout chunk (state=3): >>><<< 29946 1726882600.53654: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882600.53664: _low_level_execute_command(): starting 29946 1726882600.53669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122 `" && echo ansible-tmp-1726882600.536539-31194-24133573086122="` echo /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122 `" ) && sleep 0' 29946 1726882600.54112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882600.54115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882600.54118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.54120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882600.54122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882600.54124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.54175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882600.54179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882600.54187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882600.54235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882600.56121: stdout chunk (state=3): >>>ansible-tmp-1726882600.536539-31194-24133573086122=/root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122 <<< 29946 1726882600.56270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882600.56273: stdout chunk (state=3): >>><<< 29946 1726882600.56275: stderr chunk (state=3): >>><<< 29946 1726882600.56288: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882600.536539-31194-24133573086122=/root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882600.56321: variable 'ansible_module_compression' from source: unknown 29946 1726882600.56379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 29946 1726882600.56597: variable 'ansible_facts' from source: unknown 29946 1726882600.56655: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py 29946 1726882600.56842: Sending initial data 29946 1726882600.56852: Sent initial data (154 bytes) 29946 1726882600.57225: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882600.57238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.57249: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.57298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882600.57311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882600.57379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882600.58891: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 <<< 29946 1726882600.58902: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882600.58952: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882600.59017: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpm368s69k /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py <<< 29946 1726882600.59023: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py" <<< 29946 1726882600.59078: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpm368s69k" to remote "/root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py" <<< 29946 1726882600.59080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py" <<< 29946 1726882600.60233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882600.60267: stderr chunk (state=3): >>><<< 29946 1726882600.60271: stdout chunk (state=3): >>><<< 29946 1726882600.60298: done transferring module to remote 29946 1726882600.60306: _low_level_execute_command(): starting 29946 1726882600.60311: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/ /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py && sleep 0' 29946 1726882600.60726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882600.60729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.60731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882600.60733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.60780: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882600.60783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882600.60858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882600.62621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882600.62624: stderr chunk (state=3): >>><<< 29946 1726882600.62627: stdout 
chunk (state=3): >>><<< 29946 1726882600.62645: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882600.62661: _low_level_execute_command(): starting 29946 1726882600.62664: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/AnsiballZ_systemd.py && sleep 0' 29946 1726882600.63071: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882600.63075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.63085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.63142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882600.63145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882600.63148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882600.63230: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882600.91724: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", 
"RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4681728", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314188288", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1536666000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "Coredum<<< 29946 1726882600.91770: stdout chunk (state=3): >>>pReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 29946 1726882600.93801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882600.93805: stdout chunk (state=3): >>><<< 29946 1726882600.93808: stderr chunk (state=3): >>><<< 29946 1726882600.93812: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4681728", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3314188288", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1536666000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882600.94101: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882600.94105: _low_level_execute_command(): starting 29946 1726882600.94108: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882600.536539-31194-24133573086122/ > /dev/null 2>&1 && sleep 0' 29946 1726882600.95008: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882600.95021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882600.95041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882600.95127: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882600.97041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882600.97051: stdout chunk (state=3): >>><<< 29946 1726882600.97063: stderr chunk (state=3): >>><<< 29946 1726882600.97082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882600.97499: handler run complete 29946 1726882600.97502: attempt loop complete, returning result 29946 1726882600.97505: _execute() done 29946 1726882600.97506: dumping result to json 29946 1726882600.97508: done dumping result, returning 29946 1726882600.97511: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-95e7-9dfb-00000000007d] 29946 1726882600.97513: sending task result for task 12673a56-9f93-95e7-9dfb-00000000007d ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882600.97948: no more pending results, returning what we have 29946 1726882600.97951: results queue empty 29946 1726882600.97951: checking for any_errors_fatal 29946 1726882600.97959: done checking for any_errors_fatal 29946 1726882600.97960: checking for max_fail_percentage 29946 1726882600.97961: done checking for max_fail_percentage 29946 1726882600.97962: checking to see if all hosts have failed and the running result is not ok 29946 1726882600.97963: done checking to see if all hosts have failed 29946 1726882600.97964: getting the remaining hosts for this loop 29946 1726882600.97965: done getting the remaining hosts for this loop 29946 1726882600.97968: getting the next task for host managed_node2 29946 1726882600.97974: done getting next task for host managed_node2 29946 1726882600.97977: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29946 1726882600.97979: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882600.97990: getting variables 29946 1726882600.97992: in VariableManager get_vars() 29946 1726882600.98032: Calling all_inventory to load vars for managed_node2 29946 1726882600.98035: Calling groups_inventory to load vars for managed_node2 29946 1726882600.98038: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882600.98049: Calling all_plugins_play to load vars for managed_node2 29946 1726882600.98052: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882600.98054: Calling groups_plugins_play to load vars for managed_node2 29946 1726882600.98644: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000007d 29946 1726882600.98648: WORKER PROCESS EXITING 29946 1726882601.01577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882601.03937: done with get_vars() 29946 1726882601.04003: done getting variables 29946 1726882601.04070: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:41 -0400 (0:00:00.622) 0:00:27.152 ****** 29946 1726882601.04309: entering _queue_task() for managed_node2/service 29946 1726882601.05040: worker is 1 (out of 1 available) 29946 1726882601.05051: exiting _queue_task() for managed_node2/service 29946 1726882601.05286: done queuing things up, now waiting for results queue to drain 29946 1726882601.05288: waiting for pending results... 
29946 1726882601.05672: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29946 1726882601.05882: in run() - task 12673a56-9f93-95e7-9dfb-00000000007e 29946 1726882601.05955: variable 'ansible_search_path' from source: unknown 29946 1726882601.05965: variable 'ansible_search_path' from source: unknown 29946 1726882601.06011: calling self._execute() 29946 1726882601.06391: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882601.06407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882601.06423: variable 'omit' from source: magic vars 29946 1726882601.07356: variable 'ansible_distribution_major_version' from source: facts 29946 1726882601.07360: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882601.07582: variable 'network_provider' from source: set_fact 29946 1726882601.07595: Evaluated conditional (network_provider == "nm"): True 29946 1726882601.07807: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882601.08031: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882601.08364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882601.12147: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882601.12225: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882601.12268: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882601.12315: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882601.12346: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882601.12599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882601.12603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882601.12651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882601.12804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882601.12808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882601.12876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882601.12989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29946 1726882601.13022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882601.13071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882601.13290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882601.13295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882601.13298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882601.13501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882601.13504: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882601.13507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882601.13852: variable 'network_connections' from source: play vars 29946 1726882601.13855: variable 'profile' from source: play vars 29946 1726882601.13931: variable 'profile' from source: play vars 29946 1726882601.13944: variable 'interface' from source: set_fact 29946 1726882601.14014: variable 'interface' from source: set_fact 29946 1726882601.14109: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882601.14285: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882601.14327: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882601.14361: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882601.14401: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882601.14492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882601.14497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882601.14502: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882601.14532: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882601.14581: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882601.14850: variable 'network_connections' from source: play vars 29946 1726882601.14860: variable 'profile' from source: play vars 29946 1726882601.14928: variable 'profile' from source: play vars 29946 1726882601.14937: variable 'interface' from source: set_fact 29946 1726882601.14999: variable 'interface' from source: set_fact 29946 1726882601.15143: Evaluated conditional (__network_wpa_supplicant_required): False 29946 1726882601.15147: when evaluation is False, skipping this task 29946 1726882601.15149: _execute() done 29946 1726882601.15159: dumping result to json 29946 1726882601.15161: done dumping result, returning 29946 1726882601.15164: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-95e7-9dfb-00000000007e] 29946 1726882601.15166: sending task result for task 12673a56-9f93-95e7-9dfb-00000000007e 29946 1726882601.15236: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000007e 29946 1726882601.15240: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 29946 1726882601.15291: no more pending results, returning what we have 29946 1726882601.15296: results queue empty 29946 1726882601.15298: checking for any_errors_fatal 29946 1726882601.15323: done checking for any_errors_fatal 29946 1726882601.15324: checking for max_fail_percentage 29946 1726882601.15325: done checking for max_fail_percentage 29946 1726882601.15327: checking to see if all hosts have failed and the running result is not ok 29946 1726882601.15327: done checking to see if all hosts have failed 29946 1726882601.15328: getting the remaining hosts for this loop 29946 1726882601.15330: done getting the remaining hosts for this loop 29946 1726882601.15334: getting the next task for host managed_node2 29946 1726882601.15340: done getting next task for host managed_node2 29946 1726882601.15344: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 29946 1726882601.15346: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882601.15365: getting variables 29946 1726882601.15367: in VariableManager get_vars() 29946 1726882601.15409: Calling all_inventory to load vars for managed_node2 29946 1726882601.15412: Calling groups_inventory to load vars for managed_node2 29946 1726882601.15414: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882601.15426: Calling all_plugins_play to load vars for managed_node2 29946 1726882601.15429: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882601.15432: Calling groups_plugins_play to load vars for managed_node2 29946 1726882601.17948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882601.20157: done with get_vars() 29946 1726882601.20181: done getting variables 29946 1726882601.20248: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:41 -0400 (0:00:00.159) 0:00:27.312 ****** 29946 1726882601.20281: entering _queue_task() for managed_node2/service 29946 1726882601.20818: worker is 1 (out of 1 available) 29946 1726882601.20828: exiting _queue_task() for managed_node2/service 29946 1726882601.20837: done queuing things up, now waiting for results queue to drain 29946 1726882601.20838: waiting for pending results... 29946 1726882601.20968: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 29946 1726882601.21065: in run() - task 12673a56-9f93-95e7-9dfb-00000000007f 29946 1726882601.21072: variable 'ansible_search_path' from source: unknown 29946 1726882601.21085: variable 'ansible_search_path' from source: unknown 29946 1726882601.21175: calling self._execute() 29946 1726882601.21248: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882601.21260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882601.21274: variable 'omit' from source: magic vars 29946 1726882601.21758: variable 'ansible_distribution_major_version' from source: facts 29946 1726882601.21850: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882601.22290: variable 'network_provider' from source: set_fact 29946 1726882601.22295: Evaluated conditional (network_provider == "initscripts"): False 29946 1726882601.22298: when evaluation is False, skipping this task 29946 1726882601.22301: _execute() done 29946 1726882601.22303: dumping result to json 29946 1726882601.22305: done dumping result, returning 29946 1726882601.22308: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-95e7-9dfb-00000000007f] 29946 1726882601.22310: sending task result for task 12673a56-9f93-95e7-9dfb-00000000007f 29946 1726882601.22491: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000007f 29946 1726882601.22499: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 
1726882601.22541: no more pending results, returning what we have 29946 1726882601.22546: results queue empty 29946 1726882601.22547: checking for any_errors_fatal 29946 1726882601.22555: done checking for any_errors_fatal 29946 1726882601.22556: checking for max_fail_percentage 29946 1726882601.22558: done checking for max_fail_percentage 29946 1726882601.22559: checking to see if all hosts have failed and the running result is not ok 29946 1726882601.22560: done checking to see if all hosts have failed 29946 1726882601.22560: getting the remaining hosts for this loop 29946 1726882601.22562: done getting the remaining hosts for this loop 29946 1726882601.22565: getting the next task for host managed_node2 29946 1726882601.22572: done getting next task for host managed_node2 29946 1726882601.22576: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29946 1726882601.22579: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882601.22700: getting variables 29946 1726882601.22703: in VariableManager get_vars() 29946 1726882601.22738: Calling all_inventory to load vars for managed_node2 29946 1726882601.22740: Calling groups_inventory to load vars for managed_node2 29946 1726882601.22743: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882601.22752: Calling all_plugins_play to load vars for managed_node2 29946 1726882601.22755: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882601.22758: Calling groups_plugins_play to load vars for managed_node2 29946 1726882601.25907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882601.29327: done with get_vars() 29946 1726882601.29351: done getting variables 29946 1726882601.29532: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:41 -0400 (0:00:00.092) 0:00:27.405 ****** 29946 1726882601.29564: entering _queue_task() for managed_node2/copy 29946 1726882601.30635: worker is 1 (out of 1 available) 29946 1726882601.30646: exiting _queue_task() for managed_node2/copy 29946 1726882601.30657: done queuing things up, now waiting for results queue to drain 29946 1726882601.30658: waiting for pending results... 
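The "Enable network service" task above was skipped because the evaluated conditional (network_provider == "initscripts") was False on this host, and the "Ensure initscripts network file dependency is present" task just queued is gated the same way, as the next entries show. Below is a minimal sketch of such conditionally skipped tasks: the task names and the service/copy action plugins come from this log, while the argument values and the /etc/sysconfig/network path are illustrative assumptions, not the actual fedora.linux_system_roles.network source.

# Sketch only: module arguments and the file path are assumed for illustration.
- name: Enable network service
  service:
    name: network                      # assumed legacy initscripts service name
    enabled: true
  when: network_provider == "initscripts"

- name: Ensure initscripts network file dependency is present
  copy:
    dest: /etc/sysconfig/network       # assumed path
    content: "# Created by ansible\n"
  when: network_provider == "initscripts"

Because this run later passes provider "nm" to network_connections, network_provider is not "initscripts"; both conditionals evaluate to False and Ansible records skip_reason: "Conditional result was False".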
29946 1726882601.30969: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29946 1726882601.31131: in run() - task 12673a56-9f93-95e7-9dfb-000000000080 29946 1726882601.31135: variable 'ansible_search_path' from source: unknown 29946 1726882601.31137: variable 'ansible_search_path' from source: unknown 29946 1726882601.31164: calling self._execute() 29946 1726882601.31276: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882601.31348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882601.31351: variable 'omit' from source: magic vars 29946 1726882601.31715: variable 'ansible_distribution_major_version' from source: facts 29946 1726882601.31731: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882601.31855: variable 'network_provider' from source: set_fact 29946 1726882601.31865: Evaluated conditional (network_provider == "initscripts"): False 29946 1726882601.31872: when evaluation is False, skipping this task 29946 1726882601.31879: _execute() done 29946 1726882601.31900: dumping result to json 29946 1726882601.31907: done dumping result, returning 29946 1726882601.32003: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-95e7-9dfb-000000000080] 29946 1726882601.32007: sending task result for task 12673a56-9f93-95e7-9dfb-000000000080 29946 1726882601.32074: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000080 29946 1726882601.32077: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 29946 1726882601.32152: no more pending results, returning what we have 29946 1726882601.32156: results queue empty 29946 1726882601.32157: checking for any_errors_fatal 29946 1726882601.32165: done checking for any_errors_fatal 29946 1726882601.32166: checking for max_fail_percentage 29946 1726882601.32168: done checking for max_fail_percentage 29946 1726882601.32169: checking to see if all hosts have failed and the running result is not ok 29946 1726882601.32169: done checking to see if all hosts have failed 29946 1726882601.32170: getting the remaining hosts for this loop 29946 1726882601.32172: done getting the remaining hosts for this loop 29946 1726882601.32175: getting the next task for host managed_node2 29946 1726882601.32182: done getting next task for host managed_node2 29946 1726882601.32188: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29946 1726882601.32191: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882601.32207: getting variables 29946 1726882601.32238: in VariableManager get_vars() 29946 1726882601.32276: Calling all_inventory to load vars for managed_node2 29946 1726882601.32279: Calling groups_inventory to load vars for managed_node2 29946 1726882601.32281: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882601.32348: Calling all_plugins_play to load vars for managed_node2 29946 1726882601.32352: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882601.32356: Calling groups_plugins_play to load vars for managed_node2 29946 1726882601.35381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882601.39115: done with get_vars() 29946 1726882601.39138: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:41 -0400 (0:00:00.098) 0:00:27.503 ****** 29946 1726882601.39437: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 29946 1726882601.40023: worker is 1 (out of 1 available) 29946 1726882601.40038: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 29946 1726882601.40049: done queuing things up, now waiting for results queue to drain 29946 1726882601.40051: waiting for pending results... 29946 1726882601.40373: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29946 1726882601.40503: in run() - task 12673a56-9f93-95e7-9dfb-000000000081 29946 1726882601.40523: variable 'ansible_search_path' from source: unknown 29946 1726882601.40531: variable 'ansible_search_path' from source: unknown 29946 1726882601.40598: calling self._execute() 29946 1726882601.40725: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882601.40798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882601.40802: variable 'omit' from source: magic vars 29946 1726882601.41234: variable 'ansible_distribution_major_version' from source: facts 29946 1726882601.41258: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882601.41269: variable 'omit' from source: magic vars 29946 1726882601.41317: variable 'omit' from source: magic vars 29946 1726882601.41581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882601.45545: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882601.45811: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882601.45842: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882601.45874: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882601.45902: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882601.45980: variable 'network_provider' from source: set_fact 29946 1726882601.46316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 29946 1726882601.46360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882601.46388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882601.46644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882601.46648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882601.46724: variable 'omit' from source: magic vars 29946 1726882601.47036: variable 'omit' from source: magic vars 29946 1726882601.47138: variable 'network_connections' from source: play vars 29946 1726882601.47150: variable 'profile' from source: play vars 29946 1726882601.47416: variable 'profile' from source: play vars 29946 1726882601.47420: variable 'interface' from source: set_fact 29946 1726882601.47476: variable 'interface' from source: set_fact 29946 1726882601.47814: variable 'omit' from source: magic vars 29946 1726882601.47822: variable '__lsr_ansible_managed' from source: task vars 29946 1726882601.47881: variable '__lsr_ansible_managed' from source: task vars 29946 1726882601.48566: Loaded config def from plugin (lookup/template) 29946 1726882601.48570: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 29946 1726882601.48573: File lookup term: get_ansible_managed.j2 29946 1726882601.48575: variable 'ansible_search_path' from source: unknown 29946 1726882601.48577: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 29946 1726882601.48706: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 29946 1726882601.48740: variable 'ansible_search_path' from source: unknown 29946 1726882601.62238: variable 'ansible_managed' from source: unknown 29946 1726882601.62701: variable 'omit' from source: magic vars 29946 1726882601.62707: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882601.62710: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882601.62712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882601.62778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882601.62791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882601.62819: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882601.62828: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882601.62830: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882601.63102: Set connection var ansible_pipelining to False 29946 1726882601.63105: Set connection var ansible_shell_executable to /bin/sh 29946 1726882601.63108: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882601.63110: Set connection var ansible_timeout to 10 29946 1726882601.63112: Set connection var ansible_shell_type to sh 29946 1726882601.63114: Set connection var ansible_connection to ssh 29946 1726882601.63116: variable 'ansible_shell_executable' from source: unknown 29946 1726882601.63209: variable 'ansible_connection' from source: unknown 29946 1726882601.63212: variable 'ansible_module_compression' from source: unknown 29946 1726882601.63217: variable 'ansible_shell_type' from source: unknown 29946 1726882601.63220: variable 'ansible_shell_executable' from source: unknown 29946 1726882601.63222: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882601.63227: variable 'ansible_pipelining' from source: unknown 29946 1726882601.63229: variable 'ansible_timeout' from source: unknown 29946 1726882601.63233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882601.63460: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882601.63471: variable 'omit' from source: magic vars 29946 1726882601.63476: starting attempt loop 29946 1726882601.63479: running the handler 29946 1726882601.63492: _low_level_execute_command(): starting 29946 1726882601.63648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882601.65100: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882601.65196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882601.65206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882601.65546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882601.67261: stdout chunk (state=3): >>>/root <<< 29946 1726882601.67356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882601.67506: stderr chunk (state=3): >>><<< 29946 1726882601.67509: stdout chunk (state=3): >>><<< 29946 1726882601.67530: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882601.67543: _low_level_execute_command(): starting 29946 1726882601.67554: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381 `" && echo ansible-tmp-1726882601.6753035-31245-83991658240381="` echo /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381 `" ) && sleep 0' 29946 1726882601.68809: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882601.68819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882601.68834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882601.68845: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882601.68972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882601.69030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882601.69057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882601.69106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882601.69141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882601.71120: stdout chunk (state=3): >>>ansible-tmp-1726882601.6753035-31245-83991658240381=/root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381 <<< 29946 1726882601.71138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882601.71237: stderr chunk (state=3): >>><<< 29946 1726882601.71240: stdout chunk (state=3): >>><<< 29946 1726882601.71244: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882601.6753035-31245-83991658240381=/root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882601.71456: variable 'ansible_module_compression' from source: unknown 29946 1726882601.71459: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 29946 1726882601.71461: variable 'ansible_facts' from source: unknown 29946 1726882601.71665: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py 29946 1726882601.72128: Sending initial data 29946 1726882601.72131: Sent initial data (167 bytes) 29946 1726882601.73122: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882601.73191: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882601.73209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882601.73396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882601.73615: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882601.73788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882601.75476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882601.75619: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882601.75699: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpr22wdjco /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py <<< 29946 1726882601.75703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py" <<< 29946 1726882601.75909: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpr22wdjco" to remote "/root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py" <<< 29946 1726882601.78669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882601.78970: stderr chunk (state=3): >>><<< 29946 1726882601.78973: stdout chunk (state=3): >>><<< 29946 1726882601.78976: done transferring module to remote 29946 1726882601.78978: _low_level_execute_command(): starting 29946 1726882601.78980: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/ /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py && sleep 0' 29946 1726882601.80676: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882601.80727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882601.80812: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882601.80919: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882601.81149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882601.82878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882601.82990: stdout chunk (state=3): >>><<< 29946 1726882601.82994: stderr chunk (state=3): >>><<< 29946 1726882601.82998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882601.83002: _low_level_execute_command(): starting 29946 1726882601.83005: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/AnsiballZ_network_connections.py && sleep 0' 29946 1726882601.84301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882601.84421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882601.84425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882601.84526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.14534: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 29946 1726882602.16397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882602.16435: stderr chunk (state=3): >>><<< 29946 1726882602.16437: stdout chunk (state=3): >>><<< 29946 1726882602.16499: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
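The JSON document on stdout above is the return value of the fedora.linux_system_roles.network_connections module, executed on the remote host from the AnsiballZ payload that was just transferred and made executable. The following is a hedged sketch of a task that would produce the module_args shown in that result; the module name, provider: nm, and the ethtest0 connection with state: down come from the log, everything else (including omitting the internally generated __header) is an assumption.

# Sketch only -- the real role assembles these arguments itself, including the
# "# Ansible managed" __header rendered from get_ansible_managed.j2.
- name: Configure networking connection profiles
  fedora.linux_system_roles.network_connections:
    provider: nm
    connections:
      - name: ethtest0
        state: down

A connection entry with state: down deactivates the named profile rather than deleting it, which is consistent with the changed: true result and the empty stderr reported in the entries that follow.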
29946 1726882602.16509: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882602.16512: _low_level_execute_command(): starting 29946 1726882602.16514: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882601.6753035-31245-83991658240381/ > /dev/null 2>&1 && sleep 0' 29946 1726882602.17016: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882602.17020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882602.17045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.17117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.18934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882602.18961: stderr chunk (state=3): >>><<< 29946 1726882602.18965: stdout chunk (state=3): >>><<< 29946 1726882602.18975: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882602.18981: handler run complete 29946 1726882602.19004: attempt loop complete, returning result 29946 1726882602.19007: _execute() done 29946 1726882602.19009: dumping result to json 29946 1726882602.19014: done dumping result, returning 29946 1726882602.19022: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-95e7-9dfb-000000000081] 29946 1726882602.19026: sending task result for task 12673a56-9f93-95e7-9dfb-000000000081 29946 1726882602.19120: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000081 29946 1726882602.19123: WORKER PROCESS EXITING changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 29946 1726882602.19207: no more pending results, returning what we have 29946 1726882602.19210: results queue empty 29946 1726882602.19211: checking for any_errors_fatal 29946 1726882602.19218: done checking for any_errors_fatal 29946 1726882602.19219: checking for max_fail_percentage 29946 1726882602.19220: done checking for max_fail_percentage 29946 1726882602.19221: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.19222: done checking to see if all hosts have failed 29946 1726882602.19223: getting the remaining hosts for this loop 29946 1726882602.19224: done getting the remaining hosts for this loop 29946 1726882602.19228: getting the next task for host managed_node2 29946 1726882602.19233: done getting next task for host managed_node2 29946 1726882602.19237: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 29946 1726882602.19239: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.19248: getting variables 29946 1726882602.19250: in VariableManager get_vars() 29946 1726882602.19288: Calling all_inventory to load vars for managed_node2 29946 1726882602.19291: Calling groups_inventory to load vars for managed_node2 29946 1726882602.19295: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.19305: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.19308: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.19311: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.20183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.21500: done with get_vars() 29946 1726882602.21519: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:42 -0400 (0:00:00.821) 0:00:28.325 ****** 29946 1726882602.21579: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 29946 1726882602.21833: worker is 1 (out of 1 available) 29946 1726882602.21847: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 29946 1726882602.21859: done queuing things up, now waiting for results queue to drain 29946 1726882602.21860: waiting for pending results... 29946 1726882602.22042: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 29946 1726882602.22116: in run() - task 12673a56-9f93-95e7-9dfb-000000000082 29946 1726882602.22128: variable 'ansible_search_path' from source: unknown 29946 1726882602.22131: variable 'ansible_search_path' from source: unknown 29946 1726882602.22159: calling self._execute() 29946 1726882602.22238: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.22242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.22252: variable 'omit' from source: magic vars 29946 1726882602.22535: variable 'ansible_distribution_major_version' from source: facts 29946 1726882602.22544: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882602.22628: variable 'network_state' from source: role '' defaults 29946 1726882602.22642: Evaluated conditional (network_state != {}): False 29946 1726882602.22645: when evaluation is False, skipping this task 29946 1726882602.22648: _execute() done 29946 1726882602.22651: dumping result to json 29946 1726882602.22653: done dumping result, returning 29946 1726882602.22656: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-95e7-9dfb-000000000082] 29946 1726882602.22658: sending task result for task 12673a56-9f93-95e7-9dfb-000000000082 29946 1726882602.22742: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000082 29946 1726882602.22745: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882602.22795: no more pending results, returning what we have 29946 1726882602.22799: results queue empty 29946 1726882602.22800: checking for any_errors_fatal 29946 1726882602.22810: done checking for any_errors_fatal 29946 1726882602.22811: checking for max_fail_percentage 29946 
1726882602.22813: done checking for max_fail_percentage 29946 1726882602.22814: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.22815: done checking to see if all hosts have failed 29946 1726882602.22815: getting the remaining hosts for this loop 29946 1726882602.22817: done getting the remaining hosts for this loop 29946 1726882602.22820: getting the next task for host managed_node2 29946 1726882602.22825: done getting next task for host managed_node2 29946 1726882602.22829: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29946 1726882602.22831: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882602.22846: getting variables 29946 1726882602.22847: in VariableManager get_vars() 29946 1726882602.22878: Calling all_inventory to load vars for managed_node2 29946 1726882602.22880: Calling groups_inventory to load vars for managed_node2 29946 1726882602.22882: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.22900: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.22903: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.22906: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.24059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.25483: done with get_vars() 29946 1726882602.25506: done getting variables 29946 1726882602.25563: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:42 -0400 (0:00:00.040) 0:00:28.365 ****** 29946 1726882602.25586: entering _queue_task() for managed_node2/debug 29946 1726882602.25839: worker is 1 (out of 1 available) 29946 1726882602.25852: exiting _queue_task() for managed_node2/debug 29946 1726882602.25863: done queuing things up, now waiting for results queue to drain 29946 1726882602.25865: waiting for pending results... 
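The "Configure networking state" task is skipped here because network_state is still the role's empty default, so the conditional network_state != {} evaluates to False. Purely as a hypothetical illustration (not part of this run), a play-level variable like the following would make that task run and hand the declared state to the fedora.linux_system_roles.network_state module:

# Hypothetical example only; this run never defines network_state.
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: down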
29946 1726882602.26047: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29946 1726882602.26122: in run() - task 12673a56-9f93-95e7-9dfb-000000000083 29946 1726882602.26133: variable 'ansible_search_path' from source: unknown 29946 1726882602.26137: variable 'ansible_search_path' from source: unknown 29946 1726882602.26165: calling self._execute() 29946 1726882602.26245: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.26250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.26259: variable 'omit' from source: magic vars 29946 1726882602.26548: variable 'ansible_distribution_major_version' from source: facts 29946 1726882602.26559: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882602.26564: variable 'omit' from source: magic vars 29946 1726882602.26597: variable 'omit' from source: magic vars 29946 1726882602.26622: variable 'omit' from source: magic vars 29946 1726882602.26656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882602.26683: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882602.26704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882602.26717: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.26728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.26754: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882602.26757: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.26759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.26832: Set connection var ansible_pipelining to False 29946 1726882602.26835: Set connection var ansible_shell_executable to /bin/sh 29946 1726882602.26841: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882602.26848: Set connection var ansible_timeout to 10 29946 1726882602.26858: Set connection var ansible_shell_type to sh 29946 1726882602.26860: Set connection var ansible_connection to ssh 29946 1726882602.26874: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.26877: variable 'ansible_connection' from source: unknown 29946 1726882602.26880: variable 'ansible_module_compression' from source: unknown 29946 1726882602.26882: variable 'ansible_shell_type' from source: unknown 29946 1726882602.26884: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.26886: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.26895: variable 'ansible_pipelining' from source: unknown 29946 1726882602.26897: variable 'ansible_timeout' from source: unknown 29946 1726882602.26899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.27002: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882602.27012: variable 'omit' from source: magic vars 29946 1726882602.27017: starting attempt loop 29946 1726882602.27020: running the handler 29946 1726882602.27114: variable '__network_connections_result' from source: set_fact 29946 1726882602.27172: handler run complete 29946 1726882602.27187: attempt loop complete, returning result 29946 1726882602.27192: _execute() done 29946 1726882602.27198: dumping result to json 29946 1726882602.27201: done dumping result, returning 29946 1726882602.27209: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-95e7-9dfb-000000000083] 29946 1726882602.27215: sending task result for task 12673a56-9f93-95e7-9dfb-000000000083 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 29946 1726882602.27352: no more pending results, returning what we have 29946 1726882602.27355: results queue empty 29946 1726882602.27356: checking for any_errors_fatal 29946 1726882602.27361: done checking for any_errors_fatal 29946 1726882602.27362: checking for max_fail_percentage 29946 1726882602.27363: done checking for max_fail_percentage 29946 1726882602.27364: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.27365: done checking to see if all hosts have failed 29946 1726882602.27365: getting the remaining hosts for this loop 29946 1726882602.27367: done getting the remaining hosts for this loop 29946 1726882602.27370: getting the next task for host managed_node2 29946 1726882602.27375: done getting next task for host managed_node2 29946 1726882602.27379: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29946 1726882602.27380: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.27390: getting variables 29946 1726882602.27391: in VariableManager get_vars() 29946 1726882602.27427: Calling all_inventory to load vars for managed_node2 29946 1726882602.27430: Calling groups_inventory to load vars for managed_node2 29946 1726882602.27432: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.27442: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.27444: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.27447: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.28266: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000083 29946 1726882602.28270: WORKER PROCESS EXITING 29946 1726882602.28280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.29163: done with get_vars() 29946 1726882602.29178: done getting variables 29946 1726882602.29222: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:42 -0400 (0:00:00.036) 0:00:28.402 ****** 29946 1726882602.29246: entering _queue_task() for managed_node2/debug 29946 1726882602.29474: worker is 1 (out of 1 available) 29946 1726882602.29488: exiting _queue_task() for managed_node2/debug 29946 1726882602.29500: done queuing things up, now waiting for results queue to drain 29946 1726882602.29501: waiting for pending results... 
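The debug task above printed __network_connections_result.stderr_lines (a single empty line, since the module's stderr was just a newline), and the task queued next prints the full __network_connections_result. A sketch of what these two debug tasks plausibly look like at roles/network/tasks/main.yml:177 and :181 follows; only the task names and variable names are taken from the log, the exact bodies are assumed.

# Assumed shape of the two debug tasks reported in this log.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result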
29946 1726882602.29669: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29946 1726882602.29741: in run() - task 12673a56-9f93-95e7-9dfb-000000000084 29946 1726882602.29754: variable 'ansible_search_path' from source: unknown 29946 1726882602.29757: variable 'ansible_search_path' from source: unknown 29946 1726882602.29785: calling self._execute() 29946 1726882602.29859: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.29863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.29872: variable 'omit' from source: magic vars 29946 1726882602.30149: variable 'ansible_distribution_major_version' from source: facts 29946 1726882602.30161: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882602.30165: variable 'omit' from source: magic vars 29946 1726882602.30198: variable 'omit' from source: magic vars 29946 1726882602.30222: variable 'omit' from source: magic vars 29946 1726882602.30252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882602.30287: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882602.30303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882602.30317: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.30327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.30349: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882602.30352: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.30354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.30430: Set connection var ansible_pipelining to False 29946 1726882602.30434: Set connection var ansible_shell_executable to /bin/sh 29946 1726882602.30438: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882602.30444: Set connection var ansible_timeout to 10 29946 1726882602.30450: Set connection var ansible_shell_type to sh 29946 1726882602.30453: Set connection var ansible_connection to ssh 29946 1726882602.30468: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.30471: variable 'ansible_connection' from source: unknown 29946 1726882602.30474: variable 'ansible_module_compression' from source: unknown 29946 1726882602.30476: variable 'ansible_shell_type' from source: unknown 29946 1726882602.30479: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.30481: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.30485: variable 'ansible_pipelining' from source: unknown 29946 1726882602.30490: variable 'ansible_timeout' from source: unknown 29946 1726882602.30499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.30588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882602.30798: variable 'omit' from source: magic vars 29946 1726882602.30802: starting attempt loop 29946 1726882602.30804: running the handler 29946 1726882602.30806: variable '__network_connections_result' from source: set_fact 29946 1726882602.30808: variable '__network_connections_result' from source: set_fact 29946 1726882602.30851: handler run complete 29946 1726882602.30879: attempt loop complete, returning result 29946 1726882602.30887: _execute() done 29946 1726882602.30896: dumping result to json 29946 1726882602.30906: done dumping result, returning 29946 1726882602.30943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-95e7-9dfb-000000000084] 29946 1726882602.30954: sending task result for task 12673a56-9f93-95e7-9dfb-000000000084 ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 29946 1726882602.31205: no more pending results, returning what we have 29946 1726882602.31209: results queue empty 29946 1726882602.31210: checking for any_errors_fatal 29946 1726882602.31218: done checking for any_errors_fatal 29946 1726882602.31219: checking for max_fail_percentage 29946 1726882602.31221: done checking for max_fail_percentage 29946 1726882602.31222: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.31223: done checking to see if all hosts have failed 29946 1726882602.31224: getting the remaining hosts for this loop 29946 1726882602.31225: done getting the remaining hosts for this loop 29946 1726882602.31229: getting the next task for host managed_node2 29946 1726882602.31235: done getting next task for host managed_node2 29946 1726882602.31240: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29946 1726882602.31242: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.31254: getting variables 29946 1726882602.31256: in VariableManager get_vars() 29946 1726882602.31292: Calling all_inventory to load vars for managed_node2 29946 1726882602.31399: Calling groups_inventory to load vars for managed_node2 29946 1726882602.31409: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.31504: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.31508: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.31519: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.32138: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000084 29946 1726882602.32142: WORKER PROCESS EXITING 29946 1726882602.33626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.35556: done with get_vars() 29946 1726882602.35576: done getting variables 29946 1726882602.35624: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:42 -0400 (0:00:00.063) 0:00:28.466 ****** 29946 1726882602.35652: entering _queue_task() for managed_node2/debug 29946 1726882602.35899: worker is 1 (out of 1 available) 29946 1726882602.36202: exiting _queue_task() for managed_node2/debug 29946 1726882602.36212: done queuing things up, now waiting for results queue to drain 29946 1726882602.36214: waiting for pending results... 
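The result dumped above for the network_connections debug task echoes the arguments the role passed to its backing module: a single connection profile, ethtest0, set to state "down", handled by the "nm" provider. A hypothetical invocation that would produce those arguments (the surrounding play and host pattern are assumptions; the connection settings are taken from the logged _invocation block) could look like:

    - name: Bring down the test profile
      vars:
        network_connections:
          - name: ethtest0
            state: down
      include_role:
        name: fedora.linux_system_roles.network
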
29946 1726882602.36381: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29946 1726882602.36480: in run() - task 12673a56-9f93-95e7-9dfb-000000000085 29946 1726882602.36485: variable 'ansible_search_path' from source: unknown 29946 1726882602.36487: variable 'ansible_search_path' from source: unknown 29946 1726882602.36646: calling self._execute() 29946 1726882602.36755: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.36772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.36785: variable 'omit' from source: magic vars 29946 1726882602.37181: variable 'ansible_distribution_major_version' from source: facts 29946 1726882602.37203: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882602.37342: variable 'network_state' from source: role '' defaults 29946 1726882602.37357: Evaluated conditional (network_state != {}): False 29946 1726882602.37368: when evaluation is False, skipping this task 29946 1726882602.37376: _execute() done 29946 1726882602.37383: dumping result to json 29946 1726882602.37390: done dumping result, returning 29946 1726882602.37404: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-95e7-9dfb-000000000085] 29946 1726882602.37417: sending task result for task 12673a56-9f93-95e7-9dfb-000000000085 skipping: [managed_node2] => { "false_condition": "network_state != {}" } 29946 1726882602.37633: no more pending results, returning what we have 29946 1726882602.37637: results queue empty 29946 1726882602.37638: checking for any_errors_fatal 29946 1726882602.37649: done checking for any_errors_fatal 29946 1726882602.37650: checking for max_fail_percentage 29946 1726882602.37652: done checking for max_fail_percentage 29946 1726882602.37653: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.37654: done checking to see if all hosts have failed 29946 1726882602.37655: getting the remaining hosts for this loop 29946 1726882602.37656: done getting the remaining hosts for this loop 29946 1726882602.37660: getting the next task for host managed_node2 29946 1726882602.37666: done getting next task for host managed_node2 29946 1726882602.37669: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 29946 1726882602.37672: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.37688: getting variables 29946 1726882602.37801: in VariableManager get_vars() 29946 1726882602.37842: Calling all_inventory to load vars for managed_node2 29946 1726882602.37845: Calling groups_inventory to load vars for managed_node2 29946 1726882602.37847: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.37868: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000085 29946 1726882602.37871: WORKER PROCESS EXITING 29946 1726882602.37882: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.37886: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.37889: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.39705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.41930: done with get_vars() 29946 1726882602.41958: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:42 -0400 (0:00:00.064) 0:00:28.530 ****** 29946 1726882602.42109: entering _queue_task() for managed_node2/ping 29946 1726882602.42726: worker is 1 (out of 1 available) 29946 1726882602.42740: exiting _queue_task() for managed_node2/ping 29946 1726882602.42752: done queuing things up, now waiting for results queue to drain 29946 1726882602.42754: waiting for pending results... 29946 1726882602.43154: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 29946 1726882602.43235: in run() - task 12673a56-9f93-95e7-9dfb-000000000086 29946 1726882602.43260: variable 'ansible_search_path' from source: unknown 29946 1726882602.43321: variable 'ansible_search_path' from source: unknown 29946 1726882602.43366: calling self._execute() 29946 1726882602.43600: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.43640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.43735: variable 'omit' from source: magic vars 29946 1726882602.44139: variable 'ansible_distribution_major_version' from source: facts 29946 1726882602.44157: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882602.44176: variable 'omit' from source: magic vars 29946 1726882602.44602: variable 'omit' from source: magic vars 29946 1726882602.44605: variable 'omit' from source: magic vars 29946 1726882602.44607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882602.44610: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882602.44612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882602.44613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.44615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.44617: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882602.44619: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.44712: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed_node2' 29946 1726882602.44917: Set connection var ansible_pipelining to False 29946 1726882602.44998: Set connection var ansible_shell_executable to /bin/sh 29946 1726882602.45001: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882602.45003: Set connection var ansible_timeout to 10 29946 1726882602.45005: Set connection var ansible_shell_type to sh 29946 1726882602.45007: Set connection var ansible_connection to ssh 29946 1726882602.45009: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.45010: variable 'ansible_connection' from source: unknown 29946 1726882602.45013: variable 'ansible_module_compression' from source: unknown 29946 1726882602.45014: variable 'ansible_shell_type' from source: unknown 29946 1726882602.45016: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.45018: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.45020: variable 'ansible_pipelining' from source: unknown 29946 1726882602.45022: variable 'ansible_timeout' from source: unknown 29946 1726882602.45023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.45498: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882602.45527: variable 'omit' from source: magic vars 29946 1726882602.45545: starting attempt loop 29946 1726882602.45584: running the handler 29946 1726882602.45606: _low_level_execute_command(): starting 29946 1726882602.45626: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882602.46442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.46446: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882602.46449: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882602.46475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.46550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.48430: stdout chunk (state=3): >>>/root <<< 29946 1726882602.48433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882602.48497: stderr chunk (state=3): >>><<< 29946 1726882602.48501: stdout chunk (state=3): >>><<< 29946 1726882602.48624: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882602.48631: _low_level_execute_command(): starting 29946 1726882602.48634: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671 `" && echo ansible-tmp-1726882602.4853132-31305-61195802804671="` echo /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671 `" ) && sleep 0' 29946 1726882602.49657: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882602.49665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882602.49668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.49671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882602.49713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.50277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882602.50281: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.52069: stdout chunk (state=3): >>>ansible-tmp-1726882602.4853132-31305-61195802804671=/root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671 <<< 29946 1726882602.52153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882602.52156: stderr chunk (state=3): >>><<< 29946 1726882602.52159: stdout chunk (state=3): >>><<< 29946 1726882602.52180: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882602.4853132-31305-61195802804671=/root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882602.52230: variable 'ansible_module_compression' from source: unknown 29946 1726882602.52268: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 29946 1726882602.52398: variable 'ansible_facts' from source: unknown 29946 1726882602.52403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py 29946 1726882602.52815: Sending initial data 29946 1726882602.52818: Sent initial data (152 bytes) 29946 1726882602.54173: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882602.54372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.54466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.56022: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server 
supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882602.56080: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882602.56501: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpqd8uphdq /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py <<< 29946 1726882602.56504: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py" <<< 29946 1726882602.56587: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpqd8uphdq" to remote "/root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py" <<< 29946 1726882602.56591: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py" <<< 29946 1726882602.58092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882602.58164: stderr chunk (state=3): >>><<< 29946 1726882602.58273: stdout chunk (state=3): >>><<< 29946 1726882602.58277: done transferring module to remote 29946 1726882602.58279: _low_level_execute_command(): starting 29946 1726882602.58281: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/ /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py && sleep 0' 29946 1726882602.59410: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882602.59423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.59435: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.59502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882602.59515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882602.59708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.59804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.61514: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 29946 1726882602.61552: stderr chunk (state=3): >>><<< 29946 1726882602.61555: stdout chunk (state=3): >>><<< 29946 1726882602.61570: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882602.61578: _low_level_execute_command(): starting 29946 1726882602.61589: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/AnsiballZ_ping.py && sleep 0' 29946 1726882602.62799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882602.62813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882602.62824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.62875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882602.62887: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882602.63203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.63305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.78066: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 29946 1726882602.79252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882602.79256: stderr chunk (state=3): >>><<< 29946 1726882602.79261: stdout chunk (state=3): >>><<< 29946 1726882602.79279: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882602.79303: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882602.79312: _low_level_execute_command(): starting 29946 1726882602.79317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882602.4853132-31305-61195802804671/ > /dev/null 2>&1 && sleep 0' 29946 1726882602.79743: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882602.79747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.79749: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882602.79751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.79811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882602.79813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.79868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.81660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882602.81684: stderr chunk (state=3): >>><<< 29946 1726882602.81687: stdout chunk (state=3): >>><<< 29946 1726882602.81708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882602.81716: handler run complete 29946 1726882602.81727: attempt loop complete, returning result 29946 1726882602.81730: _execute() done 29946 1726882602.81734: dumping result to json 29946 1726882602.81736: done dumping result, returning 29946 1726882602.81746: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-95e7-9dfb-000000000086] 29946 1726882602.81750: sending task result for task 12673a56-9f93-95e7-9dfb-000000000086 29946 1726882602.81836: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000086 29946 1726882602.81839: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "ping": "pong" } 29946 1726882602.81896: no more pending results, returning what we have 29946 1726882602.81899: results queue empty 29946 1726882602.81900: checking for any_errors_fatal 29946 1726882602.81908: done checking for any_errors_fatal 29946 1726882602.81909: checking for max_fail_percentage 29946 1726882602.81911: done checking for max_fail_percentage 29946 1726882602.81911: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.81912: done checking to see if all hosts have failed 29946 1726882602.81913: getting the remaining hosts for this loop 29946 1726882602.81914: done getting the remaining hosts for this loop 29946 1726882602.81917: getting the next task for host managed_node2 29946 1726882602.81923: done getting next task for host managed_node2 29946 1726882602.81925: ^ task is: TASK: meta (role_complete) 29946 1726882602.81927: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882602.81936: getting variables 29946 1726882602.81938: in VariableManager get_vars() 29946 1726882602.81976: Calling all_inventory to load vars for managed_node2 29946 1726882602.81979: Calling groups_inventory to load vars for managed_node2 29946 1726882602.81981: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.81990: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.81995: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.81997: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.82854: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.83743: done with get_vars() 29946 1726882602.83760: done getting variables 29946 1726882602.83819: done queuing things up, now waiting for results queue to drain 29946 1726882602.83821: results queue empty 29946 1726882602.83822: checking for any_errors_fatal 29946 1726882602.83823: done checking for any_errors_fatal 29946 1726882602.83824: checking for max_fail_percentage 29946 1726882602.83825: done checking for max_fail_percentage 29946 1726882602.83825: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.83826: done checking to see if all hosts have failed 29946 1726882602.83826: getting the remaining hosts for this loop 29946 1726882602.83827: done getting the remaining hosts for this loop 29946 1726882602.83829: getting the next task for host managed_node2 29946 1726882602.83832: done getting next task for host managed_node2 29946 1726882602.83834: ^ task is: TASK: meta (flush_handlers) 29946 1726882602.83835: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.83837: getting variables 29946 1726882602.83837: in VariableManager get_vars() 29946 1726882602.83846: Calling all_inventory to load vars for managed_node2 29946 1726882602.83847: Calling groups_inventory to load vars for managed_node2 29946 1726882602.83848: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.83852: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.83853: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.83855: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.84601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.85475: done with get_vars() 29946 1726882602.85490: done getting variables 29946 1726882602.85526: in VariableManager get_vars() 29946 1726882602.85535: Calling all_inventory to load vars for managed_node2 29946 1726882602.85536: Calling groups_inventory to load vars for managed_node2 29946 1726882602.85537: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.85540: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.85542: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.85543: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.86197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.87077: done with get_vars() 29946 1726882602.87099: done queuing things up, now waiting for results queue to drain 29946 1726882602.87101: results queue empty 29946 1726882602.87102: checking for any_errors_fatal 29946 1726882602.87103: done checking for any_errors_fatal 29946 1726882602.87103: checking for max_fail_percentage 29946 1726882602.87104: done checking for max_fail_percentage 29946 1726882602.87104: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.87105: done checking to see if all hosts have failed 29946 1726882602.87105: getting the remaining hosts for this loop 29946 1726882602.87106: done getting the remaining hosts for this loop 29946 1726882602.87108: getting the next task for host managed_node2 29946 1726882602.87110: done getting next task for host managed_node2 29946 1726882602.87111: ^ task is: TASK: meta (flush_handlers) 29946 1726882602.87112: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.87114: getting variables 29946 1726882602.87115: in VariableManager get_vars() 29946 1726882602.87122: Calling all_inventory to load vars for managed_node2 29946 1726882602.87123: Calling groups_inventory to load vars for managed_node2 29946 1726882602.87124: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.87128: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.87129: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.87131: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.87826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.89109: done with get_vars() 29946 1726882602.89123: done getting variables 29946 1726882602.89156: in VariableManager get_vars() 29946 1726882602.89164: Calling all_inventory to load vars for managed_node2 29946 1726882602.89166: Calling groups_inventory to load vars for managed_node2 29946 1726882602.89167: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.89170: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.89171: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.89173: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.89813: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.90720: done with get_vars() 29946 1726882602.90746: done queuing things up, now waiting for results queue to drain 29946 1726882602.90748: results queue empty 29946 1726882602.90749: checking for any_errors_fatal 29946 1726882602.90750: done checking for any_errors_fatal 29946 1726882602.90751: checking for max_fail_percentage 29946 1726882602.90752: done checking for max_fail_percentage 29946 1726882602.90752: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.90753: done checking to see if all hosts have failed 29946 1726882602.90754: getting the remaining hosts for this loop 29946 1726882602.90755: done getting the remaining hosts for this loop 29946 1726882602.90757: getting the next task for host managed_node2 29946 1726882602.90760: done getting next task for host managed_node2 29946 1726882602.90761: ^ task is: None 29946 1726882602.90762: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.90763: done queuing things up, now waiting for results queue to drain 29946 1726882602.90764: results queue empty 29946 1726882602.90765: checking for any_errors_fatal 29946 1726882602.90766: done checking for any_errors_fatal 29946 1726882602.90766: checking for max_fail_percentage 29946 1726882602.90767: done checking for max_fail_percentage 29946 1726882602.90768: checking to see if all hosts have failed and the running result is not ok 29946 1726882602.90769: done checking to see if all hosts have failed 29946 1726882602.90770: getting the next task for host managed_node2 29946 1726882602.90772: done getting next task for host managed_node2 29946 1726882602.90773: ^ task is: None 29946 1726882602.90774: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882602.90819: in VariableManager get_vars() 29946 1726882602.90835: done with get_vars() 29946 1726882602.90841: in VariableManager get_vars() 29946 1726882602.90849: done with get_vars() 29946 1726882602.90854: variable 'omit' from source: magic vars 29946 1726882602.90883: in VariableManager get_vars() 29946 1726882602.90895: done with get_vars() 29946 1726882602.90917: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 29946 1726882602.91156: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29946 1726882602.91179: getting the remaining hosts for this loop 29946 1726882602.91181: done getting the remaining hosts for this loop 29946 1726882602.91183: getting the next task for host managed_node2 29946 1726882602.91186: done getting next task for host managed_node2 29946 1726882602.91188: ^ task is: TASK: Gathering Facts 29946 1726882602.91189: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882602.91191: getting variables 29946 1726882602.91192: in VariableManager get_vars() 29946 1726882602.91202: Calling all_inventory to load vars for managed_node2 29946 1726882602.91204: Calling groups_inventory to load vars for managed_node2 29946 1726882602.91205: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882602.91210: Calling all_plugins_play to load vars for managed_node2 29946 1726882602.91212: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882602.91215: Calling groups_plugins_play to load vars for managed_node2 29946 1726882602.92415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882602.93984: done with get_vars() 29946 1726882602.94014: done getting variables 29946 1726882602.94060: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Friday 20 September 2024 21:36:42 -0400 (0:00:00.519) 0:00:29.050 ****** 29946 1726882602.94087: entering _queue_task() for managed_node2/gather_facts 29946 1726882602.94403: worker is 1 (out of 1 available) 29946 1726882602.94417: exiting _queue_task() for managed_node2/gather_facts 29946 1726882602.94430: done queuing things up, now waiting for results queue to drain 29946 1726882602.94432: waiting for pending results... 
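At this point the run has left the network role and entered a new play, PLAY [Delete the interface], whose first task is the fact gathering queued above (task path .../playbooks/down_profile+delete_interface.yml:5). A hypothetical outline of that play header follows; only the play name, the file path, and the implicit fact gathering are visible in the log, so the host pattern and everything else are assumptions:

    - name: Delete the interface
      hosts: managed_node2   # assumption: the play's hosts line is not shown in the log
      gather_facts: true     # corresponds to the Gathering Facts task queued above
      # the interface deletion tasks themselves follow in the source playbook and are outside this excerpt
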
29946 1726882602.94610: running TaskExecutor() for managed_node2/TASK: Gathering Facts 29946 1726882602.94684: in run() - task 12673a56-9f93-95e7-9dfb-00000000057e 29946 1726882602.94701: variable 'ansible_search_path' from source: unknown 29946 1726882602.94799: calling self._execute() 29946 1726882602.94860: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.94881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.94900: variable 'omit' from source: magic vars 29946 1726882602.95303: variable 'ansible_distribution_major_version' from source: facts 29946 1726882602.95322: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882602.95333: variable 'omit' from source: magic vars 29946 1726882602.95365: variable 'omit' from source: magic vars 29946 1726882602.95411: variable 'omit' from source: magic vars 29946 1726882602.95454: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882602.95505: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882602.95532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882602.95577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.95580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882602.95714: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882602.95717: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.95720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.95798: Set connection var ansible_pipelining to False 29946 1726882602.95804: Set connection var ansible_shell_executable to /bin/sh 29946 1726882602.95836: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882602.95839: Set connection var ansible_timeout to 10 29946 1726882602.95841: Set connection var ansible_shell_type to sh 29946 1726882602.95843: Set connection var ansible_connection to ssh 29946 1726882602.95861: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.95882: variable 'ansible_connection' from source: unknown 29946 1726882602.95885: variable 'ansible_module_compression' from source: unknown 29946 1726882602.95890: variable 'ansible_shell_type' from source: unknown 29946 1726882602.95895: variable 'ansible_shell_executable' from source: unknown 29946 1726882602.95898: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882602.95900: variable 'ansible_pipelining' from source: unknown 29946 1726882602.95903: variable 'ansible_timeout' from source: unknown 29946 1726882602.95905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882602.96202: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882602.96206: variable 'omit' from source: magic vars 29946 1726882602.96208: starting attempt loop 29946 1726882602.96211: running the 
handler 29946 1726882602.96213: variable 'ansible_facts' from source: unknown 29946 1726882602.96215: _low_level_execute_command(): starting 29946 1726882602.96217: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882602.96898: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882602.96917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882602.96930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882602.96967: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882602.96994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882602.97046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882602.98658: stdout chunk (state=3): >>>/root <<< 29946 1726882602.98787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882602.98795: stdout chunk (state=3): >>><<< 29946 1726882602.98801: stderr chunk (state=3): >>><<< 29946 1726882602.98824: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882602.98855: _low_level_execute_command(): starting 29946 1726882602.98858: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700 `" && echo 
ansible-tmp-1726882602.9883082-31334-101503684867700="` echo /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700 `" ) && sleep 0' 29946 1726882603.00161: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882603.00165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882603.00168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882603.00170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882603.00173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882603.00175: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882603.00185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.00379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882603.00411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882603.00600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882603.02470: stdout chunk (state=3): >>>ansible-tmp-1726882602.9883082-31334-101503684867700=/root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700 <<< 29946 1726882603.02501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882603.02550: stderr chunk (state=3): >>><<< 29946 1726882603.02572: stdout chunk (state=3): >>><<< 29946 1726882603.02614: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882602.9883082-31334-101503684867700=/root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 29946 1726882603.02710: variable 'ansible_module_compression' from source: unknown 29946 1726882603.02917: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 1726882603.02953: variable 'ansible_facts' from source: unknown 29946 1726882603.03576: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py 29946 1726882603.03840: Sending initial data 29946 1726882603.03852: Sent initial data (154 bytes) 29946 1726882603.05264: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.05357: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882603.05480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882603.05590: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882603.07114: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882603.07167: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
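The stdout chunk that follows shows the actual transfer: an sftp batch "put" of the locally built payload (tmphysquxx0) to AnsiballZ_setup.py inside the remote temp directory, carried over the already-open multiplexed SSH connection whose SFTP extensions were just negotiated above. Below is a minimal Python sketch of driving such a batch-mode put; it is not Ansible's implementation, and the host, ControlPath socket, and file paths are hypothetical placeholders rather than values taken from this run.

# Minimal sketch, assuming an OpenSSH client and working key-based auth:
# feed a single "put" command to sftp in batch mode ("-b -" reads the batch
# from stdin), reusing an existing ControlMaster socket so no new TCP or
# auth handshake is needed. Host, socket path, and file paths are placeholders.
import subprocess

host = "root@10.31.14.69"                          # placeholder target
local = "/tmp/ansible-local/payload.py"            # placeholder local file
remote = "/root/.ansible/tmp/AnsiballZ_setup.py"   # placeholder remote path

subprocess.run(
    ["sftp", "-o", "ControlPath=/root/.ansible/cp/%C", "-b", "-", host],
    input="put %s %s\n" % (local, remote),
    text=True,
    check=True,
)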
<<< 29946 1726882603.07279: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmphysquxx0 /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py <<< 29946 1726882603.07282: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py" <<< 29946 1726882603.07359: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmphysquxx0" to remote "/root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py" <<< 29946 1726882603.10122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882603.10183: stderr chunk (state=3): >>><<< 29946 1726882603.10199: stdout chunk (state=3): >>><<< 29946 1726882603.10228: done transferring module to remote 29946 1726882603.10244: _low_level_execute_command(): starting 29946 1726882603.10254: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/ /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py && sleep 0' 29946 1726882603.11573: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882603.11576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882603.11579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.11581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882603.11583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882603.11587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882603.11589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.11708: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882603.11765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882603.11819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882603.13544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882603.13702: stderr chunk (state=3): >>><<< 29946 1726882603.13705: stdout chunk (state=3): >>><<< 29946 1726882603.13708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882603.13710: _low_level_execute_command(): starting 29946 1726882603.13712: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/AnsiballZ_setup.py && sleep 0' 29946 1726882603.14768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882603.14771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882603.14777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.14779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882603.14781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.14952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882603.15014: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882603.15078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882603.15297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882603.82712: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "43", "epoch": "1726882603", "epoch_int": "1726882603", "date": "2024-09-20", "time": "21:36:43", "iso8601_micro": "2024-09-21T01:36:43.411180Z", "iso8601": "2024-09-21T01:36:43Z", "iso8601_basic": "20240920T213643411180", "iso8601_basic_short": "20240920T213643", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", 
"ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.53662109375, "5m": 0.51171875, "15m": 0.2939453125}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,<<< 29946 1726882603.82773: stdout chunk (state=3): >>>64G-:512M", "net.ifnames": "0", 
"console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 793, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789741056, "block_size": 4096, "block_total": 65519099, 
"block_available": 63913511, "block_used": 1605588, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0", "ethtest0", "peerethtest0", "rpltstbr"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f2:06:aa:e8:e0:af", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f006:aaff:fee8:e0af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "aa:8b:67:39:99:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::a88b:67ff:fe39:99d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", 
"tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmenta<<< 29946 1726882603.82804: stdout chunk (state=3): >>>tion": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": 
"6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::f006:aaff:fee8:e0af", "fe80::a88b:67ff:fe39:99d2", "fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b", "fe80::a88b:67ff:fe39:99d2", "fe80::f006:aaff:fee8:e0af"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882603.84900: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882603.84904: stdout chunk (state=3): >>><<< 29946 1726882603.84907: stderr chunk (state=3): >>><<< 29946 1726882603.84911: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "43", "epoch": "1726882603", "epoch_int": "1726882603", "date": "2024-09-20", "time": "21:36:43", "iso8601_micro": "2024-09-21T01:36:43.411180Z", "iso8601": "2024-09-21T01:36:43Z", "iso8601_basic": "20240920T213643411180", "iso8601_basic_short": "20240920T213643", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_fips": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_loadavg": {"1m": 0.53662109375, "5m": 0.51171875, "15m": 0.2939453125}, "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", 
"ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2960, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 571, "free": 2960}, "nocache": {"free": 3300, "used": 231}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 793, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789741056, "block_size": 4096, "block_total": 65519099, "block_available": 63913511, "block_used": 1605588, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["lo", "eth0", "ethtest0", "peerethtest0", "rpltstbr"], "ansible_ethtest0": {"device": "ethtest0", "macaddress": "f2:06:aa:e8:e0:af", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::f006:aaff:fee8:e0af", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_peerethtest0": {"device": "peerethtest0", "macaddress": "aa:8b:67:39:99:d2", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::a88b:67ff:fe39:99d2", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "off", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "on", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", 
"vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::f006:aaff:fee8:e0af", 
"fe80::a88b:67ff:fe39:99d2", "fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b", "fe80::a88b:67ff:fe39:99d2", "fe80::f006:aaff:fee8:e0af"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882603.85437: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882603.85470: _low_level_execute_command(): starting 29946 1726882603.85490: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882602.9883082-31334-101503684867700/ > /dev/null 2>&1 && sleep 0' 29946 1726882603.86311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882603.86378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882603.86402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882603.86430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882603.86526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882603.88307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882603.88332: stderr chunk (state=3): >>><<< 29946 1726882603.88334: stdout chunk (state=3): >>><<< 29946 1726882603.88345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882603.88391: handler run complete 29946 1726882603.88467: variable 'ansible_facts' from source: unknown 29946 1726882603.88663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882603.89052: variable 'ansible_facts' from source: unknown 29946 1726882603.89168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882603.89357: attempt loop complete, returning result 29946 1726882603.89360: _execute() done 29946 1726882603.89363: dumping result to json 29946 1726882603.89391: done dumping result, returning 29946 1726882603.89399: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-95e7-9dfb-00000000057e] 29946 1726882603.89403: sending task result for task 12673a56-9f93-95e7-9dfb-00000000057e ok: [managed_node2] 29946 1726882603.90055: no more pending results, returning what we have 29946 1726882603.90058: results queue empty 29946 1726882603.90058: checking for any_errors_fatal 29946 1726882603.90059: done checking for any_errors_fatal 29946 1726882603.90060: checking for max_fail_percentage 29946 1726882603.90061: done checking for max_fail_percentage 29946 1726882603.90061: checking to see if all hosts have failed and the running result is not ok 29946 1726882603.90062: done checking to see if all hosts have failed 29946 1726882603.90062: getting the remaining hosts for this loop 29946 1726882603.90063: done getting the 
remaining hosts for this loop 29946 1726882603.90065: getting the next task for host managed_node2 29946 1726882603.90069: done getting next task for host managed_node2 29946 1726882603.90070: ^ task is: TASK: meta (flush_handlers) 29946 1726882603.90072: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882603.90075: getting variables 29946 1726882603.90076: in VariableManager get_vars() 29946 1726882603.90097: Calling all_inventory to load vars for managed_node2 29946 1726882603.90099: Calling groups_inventory to load vars for managed_node2 29946 1726882603.90101: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882603.90110: Calling all_plugins_play to load vars for managed_node2 29946 1726882603.90111: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882603.90114: Calling groups_plugins_play to load vars for managed_node2 29946 1726882603.90702: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000057e 29946 1726882603.90706: WORKER PROCESS EXITING 29946 1726882603.90946: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882603.92412: done with get_vars() 29946 1726882603.92435: done getting variables 29946 1726882603.92521: in VariableManager get_vars() 29946 1726882603.92528: Calling all_inventory to load vars for managed_node2 29946 1726882603.92529: Calling groups_inventory to load vars for managed_node2 29946 1726882603.92531: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882603.92534: Calling all_plugins_play to load vars for managed_node2 29946 1726882603.92535: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882603.92537: Calling groups_plugins_play to load vars for managed_node2 29946 1726882603.93195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882603.94078: done with get_vars() 29946 1726882603.94106: done queuing things up, now waiting for results queue to drain 29946 1726882603.94108: results queue empty 29946 1726882603.94109: checking for any_errors_fatal 29946 1726882603.94111: done checking for any_errors_fatal 29946 1726882603.94112: checking for max_fail_percentage 29946 1726882603.94113: done checking for max_fail_percentage 29946 1726882603.94114: checking to see if all hosts have failed and the running result is not ok 29946 1726882603.94115: done checking to see if all hosts have failed 29946 1726882603.94123: getting the remaining hosts for this loop 29946 1726882603.94124: done getting the remaining hosts for this loop 29946 1726882603.94126: getting the next task for host managed_node2 29946 1726882603.94129: done getting next task for host managed_node2 29946 1726882603.94131: ^ task is: TASK: Include the task 'delete_interface.yml' 29946 1726882603.94132: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882603.94133: getting variables 29946 1726882603.94134: in VariableManager get_vars() 29946 1726882603.94142: Calling all_inventory to load vars for managed_node2 29946 1726882603.94144: Calling groups_inventory to load vars for managed_node2 29946 1726882603.94147: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882603.94151: Calling all_plugins_play to load vars for managed_node2 29946 1726882603.94154: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882603.94156: Calling groups_plugins_play to load vars for managed_node2 29946 1726882603.98682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882603.99551: done with get_vars() 29946 1726882603.99566: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Friday 20 September 2024 21:36:43 -0400 (0:00:01.055) 0:00:30.105 ****** 29946 1726882603.99620: entering _queue_task() for managed_node2/include_tasks 29946 1726882603.99912: worker is 1 (out of 1 available) 29946 1726882603.99924: exiting _queue_task() for managed_node2/include_tasks 29946 1726882603.99934: done queuing things up, now waiting for results queue to drain 29946 1726882603.99935: waiting for pending results... 29946 1726882604.00107: running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' 29946 1726882604.00170: in run() - task 12673a56-9f93-95e7-9dfb-000000000089 29946 1726882604.00190: variable 'ansible_search_path' from source: unknown 29946 1726882604.00218: calling self._execute() 29946 1726882604.00296: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.00305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.00313: variable 'omit' from source: magic vars 29946 1726882604.00598: variable 'ansible_distribution_major_version' from source: facts 29946 1726882604.00610: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882604.00616: _execute() done 29946 1726882604.00619: dumping result to json 29946 1726882604.00621: done dumping result, returning 29946 1726882604.00628: done running TaskExecutor() for managed_node2/TASK: Include the task 'delete_interface.yml' [12673a56-9f93-95e7-9dfb-000000000089] 29946 1726882604.00633: sending task result for task 12673a56-9f93-95e7-9dfb-000000000089 29946 1726882604.00717: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000089 29946 1726882604.00720: WORKER PROCESS EXITING 29946 1726882604.00744: no more pending results, returning what we have 29946 1726882604.00749: in VariableManager get_vars() 29946 1726882604.00779: Calling all_inventory to load vars for managed_node2 29946 1726882604.00781: Calling groups_inventory to load vars for managed_node2 29946 1726882604.00784: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.00801: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.00803: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.00806: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.01618: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.02592: done with get_vars() 29946 
1726882604.02606: variable 'ansible_search_path' from source: unknown 29946 1726882604.02616: we have included files to process 29946 1726882604.02617: generating all_blocks data 29946 1726882604.02618: done generating all_blocks data 29946 1726882604.02618: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 29946 1726882604.02619: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 29946 1726882604.02620: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 29946 1726882604.02776: done processing included file 29946 1726882604.02777: iterating over new_blocks loaded from include file 29946 1726882604.02778: in VariableManager get_vars() 29946 1726882604.02788: done with get_vars() 29946 1726882604.02789: filtering new block on tags 29946 1726882604.02801: done filtering new block on tags 29946 1726882604.02803: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed_node2 29946 1726882604.02806: extending task lists for all hosts with included blocks 29946 1726882604.02825: done extending task lists 29946 1726882604.02826: done processing included files 29946 1726882604.02826: results queue empty 29946 1726882604.02827: checking for any_errors_fatal 29946 1726882604.02828: done checking for any_errors_fatal 29946 1726882604.02828: checking for max_fail_percentage 29946 1726882604.02829: done checking for max_fail_percentage 29946 1726882604.02829: checking to see if all hosts have failed and the running result is not ok 29946 1726882604.02830: done checking to see if all hosts have failed 29946 1726882604.02830: getting the remaining hosts for this loop 29946 1726882604.02831: done getting the remaining hosts for this loop 29946 1726882604.02832: getting the next task for host managed_node2 29946 1726882604.02834: done getting next task for host managed_node2 29946 1726882604.02836: ^ task is: TASK: Remove test interface if necessary 29946 1726882604.02837: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882604.02839: getting variables 29946 1726882604.02839: in VariableManager get_vars() 29946 1726882604.02845: Calling all_inventory to load vars for managed_node2 29946 1726882604.02846: Calling groups_inventory to load vars for managed_node2 29946 1726882604.02848: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.02852: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.02854: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.02856: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.03512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.04368: done with get_vars() 29946 1726882604.04382: done getting variables 29946 1726882604.04417: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Friday 20 September 2024 21:36:44 -0400 (0:00:00.048) 0:00:30.153 ****** 29946 1726882604.04437: entering _queue_task() for managed_node2/command 29946 1726882604.04657: worker is 1 (out of 1 available) 29946 1726882604.04670: exiting _queue_task() for managed_node2/command 29946 1726882604.04682: done queuing things up, now waiting for results queue to drain 29946 1726882604.04683: waiting for pending results... 29946 1726882604.04852: running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary 29946 1726882604.04934: in run() - task 12673a56-9f93-95e7-9dfb-00000000058f 29946 1726882604.04946: variable 'ansible_search_path' from source: unknown 29946 1726882604.04948: variable 'ansible_search_path' from source: unknown 29946 1726882604.04976: calling self._execute() 29946 1726882604.05056: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.05059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.05069: variable 'omit' from source: magic vars 29946 1726882604.05346: variable 'ansible_distribution_major_version' from source: facts 29946 1726882604.05355: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882604.05358: variable 'omit' from source: magic vars 29946 1726882604.05387: variable 'omit' from source: magic vars 29946 1726882604.05453: variable 'interface' from source: set_fact 29946 1726882604.05469: variable 'omit' from source: magic vars 29946 1726882604.05506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882604.05531: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882604.05548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882604.05560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882604.05572: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 
1726882604.05599: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882604.05602: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.05606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.05676: Set connection var ansible_pipelining to False 29946 1726882604.05680: Set connection var ansible_shell_executable to /bin/sh 29946 1726882604.05683: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882604.05689: Set connection var ansible_timeout to 10 29946 1726882604.05698: Set connection var ansible_shell_type to sh 29946 1726882604.05701: Set connection var ansible_connection to ssh 29946 1726882604.05718: variable 'ansible_shell_executable' from source: unknown 29946 1726882604.05721: variable 'ansible_connection' from source: unknown 29946 1726882604.05724: variable 'ansible_module_compression' from source: unknown 29946 1726882604.05726: variable 'ansible_shell_type' from source: unknown 29946 1726882604.05729: variable 'ansible_shell_executable' from source: unknown 29946 1726882604.05731: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.05733: variable 'ansible_pipelining' from source: unknown 29946 1726882604.05736: variable 'ansible_timeout' from source: unknown 29946 1726882604.05741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.05840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882604.05848: variable 'omit' from source: magic vars 29946 1726882604.05854: starting attempt loop 29946 1726882604.05856: running the handler 29946 1726882604.05868: _low_level_execute_command(): starting 29946 1726882604.05874: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882604.06365: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.06402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.06406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882604.06408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882604.06411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.06458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.06461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.06464: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.06531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.08114: stdout chunk (state=3): >>>/root <<< 29946 1726882604.08214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.08240: stderr chunk (state=3): >>><<< 29946 1726882604.08243: stdout chunk (state=3): >>><<< 29946 1726882604.08265: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.08275: _low_level_execute_command(): starting 29946 1726882604.08280: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777 `" && echo ansible-tmp-1726882604.0826433-31379-83659056552777="` echo /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777 `" ) && sleep 0' 29946 1726882604.08715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.08718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882604.08728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882604.08731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882604.08733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.08778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.08784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.08787: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.08846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.10707: stdout chunk (state=3): >>>ansible-tmp-1726882604.0826433-31379-83659056552777=/root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777 <<< 29946 1726882604.10815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.10837: stderr chunk (state=3): >>><<< 29946 1726882604.10841: stdout chunk (state=3): >>><<< 29946 1726882604.10855: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882604.0826433-31379-83659056552777=/root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.10877: variable 'ansible_module_compression' from source: unknown 29946 1726882604.10922: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882604.10951: variable 'ansible_facts' from source: unknown 29946 1726882604.11006: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py 29946 1726882604.11097: Sending initial data 29946 1726882604.11101: Sent initial data (155 bytes) 29946 1726882604.11526: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.11529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882604.11531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.11533: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.11535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.11584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.11594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.11657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.13178: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 29946 1726882604.13182: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882604.13235: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882604.13299: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp0ar5s0bv /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py <<< 29946 1726882604.13305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py" <<< 29946 1726882604.13355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp0ar5s0bv" to remote "/root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py" <<< 29946 1726882604.13955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.13991: stderr chunk (state=3): >>><<< 29946 1726882604.13996: stdout chunk (state=3): >>><<< 29946 1726882604.14016: done transferring module to remote 29946 1726882604.14028: _low_level_execute_command(): starting 29946 1726882604.14031: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/ /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py && sleep 0' 29946 1726882604.14448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.14451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882604.14456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882604.14458: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.14463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.14512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.14515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.14582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.16276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.16304: stderr chunk (state=3): >>><<< 29946 1726882604.16307: stdout chunk (state=3): >>><<< 29946 1726882604.16319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.16322: _low_level_execute_command(): starting 29946 1726882604.16327: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/AnsiballZ_command.py && sleep 0' 29946 1726882604.16730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.16733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882604.16735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882604.16737: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.16739: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.16784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.16795: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.16860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.33069: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:36:44.317927", "end": "2024-09-20 21:36:44.329517", "delta": "0:00:00.011590", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882604.35312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882604.35338: stderr chunk (state=3): >>><<< 29946 1726882604.35343: stdout chunk (state=3): >>><<< 29946 1726882604.35364: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-20 21:36:44.317927", "end": "2024-09-20 21:36:44.329517", "delta": "0:00:00.011590", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
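For readers tracing the task execution above: the logged module invocation shows the command module being handed "ip link del ethtest0", with the interface name supplied by an earlier set_fact variable. A minimal task consistent with that invocation might look like the sketch below. This is an illustration only; the task name and the use of {{ interface }} mirror the log, but the actual tasks/delete_interface.yml shipped in the collection may differ (for example, the "Evaluated conditional (False): False" entry that follows suggests the task overrides its changed status, which the sketch models with changed_when).

    # Hedged sketch of a task consistent with the logged module invocation;
    # the real delete_interface.yml may differ.
    - name: Remove test interface if necessary
      ansible.builtin.command: ip link del {{ interface }}
      changed_when: false

Run this way, the module reports changed=true internally (as seen in the raw stdout chunk above) while the task result shown to the user reports changed=false, matching the "ok: [managed_node2]" result that follows.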
29946 1726882604.35395: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882604.35402: _low_level_execute_command(): starting 29946 1726882604.35407: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882604.0826433-31379-83659056552777/ > /dev/null 2>&1 && sleep 0' 29946 1726882604.35856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.35861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.35877: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.35935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.35938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.36007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.37832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.37858: stderr chunk (state=3): >>><<< 29946 1726882604.37861: stdout chunk (state=3): >>><<< 29946 1726882604.37877: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.37882: handler run complete 29946 1726882604.37904: Evaluated conditional (False): False 29946 1726882604.37913: attempt loop complete, returning result 29946 1726882604.37915: _execute() done 29946 1726882604.37918: dumping result to json 29946 1726882604.37923: done dumping result, returning 29946 1726882604.37930: done running TaskExecutor() for managed_node2/TASK: Remove test interface if necessary [12673a56-9f93-95e7-9dfb-00000000058f] 29946 1726882604.37938: sending task result for task 12673a56-9f93-95e7-9dfb-00000000058f 29946 1726882604.38037: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000058f 29946 1726882604.38041: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.011590", "end": "2024-09-20 21:36:44.329517", "rc": 0, "start": "2024-09-20 21:36:44.317927" } 29946 1726882604.38111: no more pending results, returning what we have 29946 1726882604.38115: results queue empty 29946 1726882604.38116: checking for any_errors_fatal 29946 1726882604.38117: done checking for any_errors_fatal 29946 1726882604.38118: checking for max_fail_percentage 29946 1726882604.38120: done checking for max_fail_percentage 29946 1726882604.38120: checking to see if all hosts have failed and the running result is not ok 29946 1726882604.38121: done checking to see if all hosts have failed 29946 1726882604.38122: getting the remaining hosts for this loop 29946 1726882604.38123: done getting the remaining hosts for this loop 29946 1726882604.38127: getting the next task for host managed_node2 29946 1726882604.38135: done getting next task for host managed_node2 29946 1726882604.38137: ^ task is: TASK: meta (flush_handlers) 29946 1726882604.38139: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882604.38144: getting variables 29946 1726882604.38145: in VariableManager get_vars() 29946 1726882604.38184: Calling all_inventory to load vars for managed_node2 29946 1726882604.38189: Calling groups_inventory to load vars for managed_node2 29946 1726882604.38192: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.38206: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.38208: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.38211: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.39841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.41617: done with get_vars() 29946 1726882604.41651: done getting variables 29946 1726882604.41742: in VariableManager get_vars() 29946 1726882604.41754: Calling all_inventory to load vars for managed_node2 29946 1726882604.41756: Calling groups_inventory to load vars for managed_node2 29946 1726882604.41759: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.41773: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.41776: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.41780: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.42902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.43777: done with get_vars() 29946 1726882604.43800: done queuing things up, now waiting for results queue to drain 29946 1726882604.43802: results queue empty 29946 1726882604.43802: checking for any_errors_fatal 29946 1726882604.43805: done checking for any_errors_fatal 29946 1726882604.43805: checking for max_fail_percentage 29946 1726882604.43806: done checking for max_fail_percentage 29946 1726882604.43807: checking to see if all hosts have failed and the running result is not ok 29946 1726882604.43807: done checking to see if all hosts have failed 29946 1726882604.43808: getting the remaining hosts for this loop 29946 1726882604.43808: done getting the remaining hosts for this loop 29946 1726882604.43810: getting the next task for host managed_node2 29946 1726882604.43813: done getting next task for host managed_node2 29946 1726882604.43814: ^ task is: TASK: meta (flush_handlers) 29946 1726882604.43815: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882604.43817: getting variables 29946 1726882604.43818: in VariableManager get_vars() 29946 1726882604.43823: Calling all_inventory to load vars for managed_node2 29946 1726882604.43825: Calling groups_inventory to load vars for managed_node2 29946 1726882604.43827: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.43832: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.43834: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.43836: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.44535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.45388: done with get_vars() 29946 1726882604.45405: done getting variables 29946 1726882604.45439: in VariableManager get_vars() 29946 1726882604.45450: Calling all_inventory to load vars for managed_node2 29946 1726882604.45453: Calling groups_inventory to load vars for managed_node2 29946 1726882604.45455: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.45458: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.45460: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.45461: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.46102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.46976: done with get_vars() 29946 1726882604.46996: done queuing things up, now waiting for results queue to drain 29946 1726882604.46998: results queue empty 29946 1726882604.46998: checking for any_errors_fatal 29946 1726882604.46999: done checking for any_errors_fatal 29946 1726882604.47000: checking for max_fail_percentage 29946 1726882604.47001: done checking for max_fail_percentage 29946 1726882604.47001: checking to see if all hosts have failed and the running result is not ok 29946 1726882604.47002: done checking to see if all hosts have failed 29946 1726882604.47002: getting the remaining hosts for this loop 29946 1726882604.47003: done getting the remaining hosts for this loop 29946 1726882604.47005: getting the next task for host managed_node2 29946 1726882604.47007: done getting next task for host managed_node2 29946 1726882604.47008: ^ task is: None 29946 1726882604.47009: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882604.47009: done queuing things up, now waiting for results queue to drain 29946 1726882604.47010: results queue empty 29946 1726882604.47010: checking for any_errors_fatal 29946 1726882604.47011: done checking for any_errors_fatal 29946 1726882604.47011: checking for max_fail_percentage 29946 1726882604.47012: done checking for max_fail_percentage 29946 1726882604.47012: checking to see if all hosts have failed and the running result is not ok 29946 1726882604.47013: done checking to see if all hosts have failed 29946 1726882604.47014: getting the next task for host managed_node2 29946 1726882604.47015: done getting next task for host managed_node2 29946 1726882604.47015: ^ task is: None 29946 1726882604.47016: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882604.47052: in VariableManager get_vars() 29946 1726882604.47068: done with get_vars() 29946 1726882604.47072: in VariableManager get_vars() 29946 1726882604.47081: done with get_vars() 29946 1726882604.47085: variable 'omit' from source: magic vars 29946 1726882604.47171: variable 'profile' from source: play vars 29946 1726882604.47242: in VariableManager get_vars() 29946 1726882604.47252: done with get_vars() 29946 1726882604.47266: variable 'omit' from source: magic vars 29946 1726882604.47313: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 29946 1726882604.47709: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29946 1726882604.47733: getting the remaining hosts for this loop 29946 1726882604.47735: done getting the remaining hosts for this loop 29946 1726882604.47738: getting the next task for host managed_node2 29946 1726882604.47739: done getting next task for host managed_node2 29946 1726882604.47741: ^ task is: TASK: Gathering Facts 29946 1726882604.47742: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882604.47743: getting variables 29946 1726882604.47744: in VariableManager get_vars() 29946 1726882604.47752: Calling all_inventory to load vars for managed_node2 29946 1726882604.47754: Calling groups_inventory to load vars for managed_node2 29946 1726882604.47755: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882604.47759: Calling all_plugins_play to load vars for managed_node2 29946 1726882604.47760: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882604.47762: Calling groups_plugins_play to load vars for managed_node2 29946 1726882604.48560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882604.49431: done with get_vars() 29946 1726882604.49446: done getting variables 29946 1726882604.49479: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Friday 20 September 2024 21:36:44 -0400 (0:00:00.450) 0:00:30.604 ****** 29946 1726882604.49502: entering _queue_task() for managed_node2/gather_facts 29946 1726882604.49752: worker is 1 (out of 1 available) 29946 1726882604.49765: exiting _queue_task() for managed_node2/gather_facts 29946 1726882604.49776: done queuing things up, now waiting for results queue to drain 29946 1726882604.49778: waiting for pending results... 
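The gather_facts action queued at this point runs the same setup module whose output appears earlier in this log, using the recorded module_args (gather_subset all, gather_timeout 10, fact_path /etc/ansible/facts.d). Written as an explicit task it would look roughly like the sketch below; this is an illustration of the implicit fact gathering, not a task taken from remove_profile.yml.

    # Hedged sketch: explicit equivalent of the implicit fact gathering logged here,
    # using the module_args recorded earlier in this run.
    - name: Gathering Facts
      ansible.builtin.setup:
        gather_subset:
          - all
        gather_timeout: 10
        fact_path: /etc/ansible/facts.d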
29946 1726882604.50088: running TaskExecutor() for managed_node2/TASK: Gathering Facts 29946 1726882604.50095: in run() - task 12673a56-9f93-95e7-9dfb-00000000059d 29946 1726882604.50099: variable 'ansible_search_path' from source: unknown 29946 1726882604.50132: calling self._execute() 29946 1726882604.50265: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.50499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.50502: variable 'omit' from source: magic vars 29946 1726882604.50678: variable 'ansible_distribution_major_version' from source: facts 29946 1726882604.50704: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882604.50717: variable 'omit' from source: magic vars 29946 1726882604.50757: variable 'omit' from source: magic vars 29946 1726882604.50801: variable 'omit' from source: magic vars 29946 1726882604.50857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882604.50902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882604.50918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882604.50932: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882604.50946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882604.50968: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882604.50972: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.50975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.51051: Set connection var ansible_pipelining to False 29946 1726882604.51064: Set connection var ansible_shell_executable to /bin/sh 29946 1726882604.51069: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882604.51075: Set connection var ansible_timeout to 10 29946 1726882604.51081: Set connection var ansible_shell_type to sh 29946 1726882604.51083: Set connection var ansible_connection to ssh 29946 1726882604.51103: variable 'ansible_shell_executable' from source: unknown 29946 1726882604.51106: variable 'ansible_connection' from source: unknown 29946 1726882604.51108: variable 'ansible_module_compression' from source: unknown 29946 1726882604.51111: variable 'ansible_shell_type' from source: unknown 29946 1726882604.51113: variable 'ansible_shell_executable' from source: unknown 29946 1726882604.51115: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882604.51117: variable 'ansible_pipelining' from source: unknown 29946 1726882604.51121: variable 'ansible_timeout' from source: unknown 29946 1726882604.51124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882604.51256: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882604.51267: variable 'omit' from source: magic vars 29946 1726882604.51270: starting attempt loop 29946 1726882604.51274: running the 
handler 29946 1726882604.51290: variable 'ansible_facts' from source: unknown 29946 1726882604.51306: _low_level_execute_command(): starting 29946 1726882604.51313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882604.51805: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.51810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882604.51812: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.51815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.51867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.51873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.51875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.51939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.53533: stdout chunk (state=3): >>>/root <<< 29946 1726882604.53679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.53683: stdout chunk (state=3): >>><<< 29946 1726882604.53685: stderr chunk (state=3): >>><<< 29946 1726882604.53799: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.53803: _low_level_execute_command(): starting 29946 1726882604.53806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052 `" && echo ansible-tmp-1726882604.5371203-31395-48677425787052="` echo /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052 `" ) && sleep 0' 29946 1726882604.54373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882604.54389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882604.54408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.54426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882604.54499: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882604.54527: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.54578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882604.54597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.54654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.54736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.56587: stdout chunk (state=3): >>>ansible-tmp-1726882604.5371203-31395-48677425787052=/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052 <<< 29946 1726882604.56726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.56757: stderr chunk (state=3): >>><<< 29946 1726882604.56760: stdout chunk (state=3): >>><<< 29946 1726882604.56858: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882604.5371203-31395-48677425787052=/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.56862: variable 'ansible_module_compression' from source: unknown 29946 1726882604.56864: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 1726882604.56925: variable 'ansible_facts' from source: unknown 29946 1726882604.57147: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py 29946 1726882604.57376: Sending initial data 29946 1726882604.57384: Sent initial data (153 bytes) 29946 1726882604.57791: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.57814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.57825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.57863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.57876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.57941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.59453: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882604.59538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882604.59608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpjqdqsacz /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py <<< 29946 1726882604.59611: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py" <<< 29946 1726882604.59681: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpjqdqsacz" to remote "/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py" <<< 29946 1726882604.61129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.61200: stderr chunk (state=3): >>><<< 29946 1726882604.61204: stdout chunk (state=3): >>><<< 29946 1726882604.61206: done transferring module to remote 29946 1726882604.61208: _low_level_execute_command(): starting 29946 1726882604.61210: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/ /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py && sleep 0' 29946 1726882604.61617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.61621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.61623: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.61625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.61678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.61684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.61743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882604.63464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882604.63484: stderr chunk (state=3): >>><<< 29946 1726882604.63488: stdout chunk (state=3): >>><<< 29946 1726882604.63508: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882604.63512: _low_level_execute_command(): starting 29946 1726882604.63516: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/AnsiballZ_setup.py && sleep 0' 29946 1726882604.63899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.63918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882604.63921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882604.63965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882604.63977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882604.64048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882605.27617: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_local": {}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.53662109375, "5m": 0.51171875, "15m": 0.2939453125}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 795, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789741056, "block_size": 4096, "block_total": 65519099, "block_available": 63913511, "block_used": 1605588, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "45", "epoch": "1726882605", "epoch_int": "1726882605", "date": "2024-09-20", "time": "21:36:45", "iso8601_micro": "2024-09-21T01:36:45.227697Z", "iso8601": "2024-09-21T01:36:45Z", "iso8601_basic": "20240920T213645227697", "iso8601_basic_short": "20240920T213645", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo", "rpltstbr"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback",<<< 29946 1726882605.27630: stdout chunk (state=3): >>> "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", 
"tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": 
"off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882605.29553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882605.29581: stderr chunk (state=3): >>><<< 29946 1726882605.29585: stdout chunk (state=3): >>><<< 29946 1726882605.29625: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_local": {}, "ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.53662109375, "5m": 0.51171875, "15m": 0.2939453125}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2955, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, 
"ansible_memory_mb": {"real": {"total": 3531, "used": 576, "free": 2955}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 795, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789741056, "block_size": 4096, "block_total": 65519099, "block_available": 63913511, "block_used": 1605588, "inode_total": 131070960, "inode_available": 131029048, "inode_used": 41912, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "45", "epoch": "1726882605", "epoch_int": "1726882605", "date": "2024-09-20", "time": "21:36:45", "iso8601_micro": "2024-09-21T01:36:45.227697Z", "iso8601": "2024-09-21T01:36:45Z", "iso8601_basic": "20240920T213645227697", "iso8601_basic_short": "20240920T213645", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["eth0", "lo", "rpltstbr"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on 
[fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
29946 1726882605.29877: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882605.29899: _low_level_execute_command(): starting 29946 1726882605.29903: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882604.5371203-31395-48677425787052/ > /dev/null 2>&1 && sleep 0' 29946 1726882605.30362: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882605.30365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882605.30367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882605.30369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882605.30371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882605.30431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882605.30434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882605.30439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882605.30504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882605.32279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882605.32308: stderr chunk (state=3): >>><<< 29946 1726882605.32311: stdout chunk (state=3): >>><<< 29946 1726882605.32323: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882605.32330: handler run complete 29946 1726882605.32417: variable 'ansible_facts' from source: unknown 29946 1726882605.32486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.32678: variable 'ansible_facts' from source: unknown 29946 1726882605.32740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.32819: attempt loop complete, returning result 29946 1726882605.32823: _execute() done 29946 1726882605.32825: dumping result to json 29946 1726882605.32847: done dumping result, returning 29946 1726882605.32859: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-95e7-9dfb-00000000059d] 29946 1726882605.32861: sending task result for task 12673a56-9f93-95e7-9dfb-00000000059d 29946 1726882605.33167: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000059d 29946 1726882605.33170: WORKER PROCESS EXITING ok: [managed_node2] 29946 1726882605.33425: no more pending results, returning what we have 29946 1726882605.33427: results queue empty 29946 1726882605.33428: checking for any_errors_fatal 29946 1726882605.33429: done checking for any_errors_fatal 29946 1726882605.33429: checking for max_fail_percentage 29946 1726882605.33430: done checking for max_fail_percentage 29946 1726882605.33431: checking to see if all hosts have failed and the running result is not ok 29946 1726882605.33431: done checking to see if all hosts have failed 29946 1726882605.33432: getting the remaining hosts for this loop 29946 1726882605.33433: done getting the remaining hosts for this loop 29946 1726882605.33435: getting the next task for host managed_node2 29946 1726882605.33439: done getting next task for host managed_node2 29946 1726882605.33440: ^ task is: TASK: meta (flush_handlers) 29946 1726882605.33441: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882605.33445: getting variables 29946 1726882605.33445: in VariableManager get_vars() 29946 1726882605.33466: Calling all_inventory to load vars for managed_node2 29946 1726882605.33468: Calling groups_inventory to load vars for managed_node2 29946 1726882605.33469: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.33476: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.33478: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.33479: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.34294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.35159: done with get_vars() 29946 1726882605.35174: done getting variables 29946 1726882605.35225: in VariableManager get_vars() 29946 1726882605.35234: Calling all_inventory to load vars for managed_node2 29946 1726882605.35236: Calling groups_inventory to load vars for managed_node2 29946 1726882605.35237: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.35240: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.35241: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.35243: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.35887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.36838: done with get_vars() 29946 1726882605.36856: done queuing things up, now waiting for results queue to drain 29946 1726882605.36858: results queue empty 29946 1726882605.36858: checking for any_errors_fatal 29946 1726882605.36860: done checking for any_errors_fatal 29946 1726882605.36861: checking for max_fail_percentage 29946 1726882605.36866: done checking for max_fail_percentage 29946 1726882605.36867: checking to see if all hosts have failed and the running result is not ok 29946 1726882605.36868: done checking to see if all hosts have failed 29946 1726882605.36868: getting the remaining hosts for this loop 29946 1726882605.36869: done getting the remaining hosts for this loop 29946 1726882605.36871: getting the next task for host managed_node2 29946 1726882605.36873: done getting next task for host managed_node2 29946 1726882605.36875: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29946 1726882605.36876: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882605.36883: getting variables 29946 1726882605.36884: in VariableManager get_vars() 29946 1726882605.36896: Calling all_inventory to load vars for managed_node2 29946 1726882605.36897: Calling groups_inventory to load vars for managed_node2 29946 1726882605.36899: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.36902: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.36903: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.36905: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.37542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.38382: done with get_vars() 29946 1726882605.38398: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:36:45 -0400 (0:00:00.889) 0:00:31.494 ****** 29946 1726882605.38450: entering _queue_task() for managed_node2/include_tasks 29946 1726882605.38707: worker is 1 (out of 1 available) 29946 1726882605.38720: exiting _queue_task() for managed_node2/include_tasks 29946 1726882605.38732: done queuing things up, now waiting for results queue to drain 29946 1726882605.38734: waiting for pending results... 29946 1726882605.38913: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 29946 1726882605.38988: in run() - task 12673a56-9f93-95e7-9dfb-000000000091 29946 1726882605.39006: variable 'ansible_search_path' from source: unknown 29946 1726882605.39010: variable 'ansible_search_path' from source: unknown 29946 1726882605.39036: calling self._execute() 29946 1726882605.39114: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.39119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.39129: variable 'omit' from source: magic vars 29946 1726882605.39403: variable 'ansible_distribution_major_version' from source: facts 29946 1726882605.39410: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882605.39416: _execute() done 29946 1726882605.39419: dumping result to json 29946 1726882605.39422: done dumping result, returning 29946 1726882605.39429: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-95e7-9dfb-000000000091] 29946 1726882605.39435: sending task result for task 12673a56-9f93-95e7-9dfb-000000000091 29946 1726882605.39518: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000091 29946 1726882605.39521: WORKER PROCESS EXITING 29946 1726882605.39555: no more pending results, returning what we have 29946 1726882605.39559: in VariableManager get_vars() 29946 1726882605.39602: Calling all_inventory to load vars for managed_node2 29946 1726882605.39604: Calling groups_inventory to load vars for managed_node2 29946 1726882605.39606: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.39619: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.39621: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.39624: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.40513: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.41382: done with get_vars() 29946 1726882605.41399: variable 'ansible_search_path' from source: unknown 29946 1726882605.41400: variable 'ansible_search_path' from source: unknown 29946 1726882605.41421: we have included files to process 29946 1726882605.41422: generating all_blocks data 29946 1726882605.41423: done generating all_blocks data 29946 1726882605.41424: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882605.41424: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882605.41426: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 29946 1726882605.41795: done processing included file 29946 1726882605.41796: iterating over new_blocks loaded from include file 29946 1726882605.41797: in VariableManager get_vars() 29946 1726882605.41810: done with get_vars() 29946 1726882605.41811: filtering new block on tags 29946 1726882605.41822: done filtering new block on tags 29946 1726882605.41823: in VariableManager get_vars() 29946 1726882605.41834: done with get_vars() 29946 1726882605.41835: filtering new block on tags 29946 1726882605.41845: done filtering new block on tags 29946 1726882605.41846: in VariableManager get_vars() 29946 1726882605.41858: done with get_vars() 29946 1726882605.41860: filtering new block on tags 29946 1726882605.41868: done filtering new block on tags 29946 1726882605.41869: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node2 29946 1726882605.41873: extending task lists for all hosts with included blocks 29946 1726882605.42073: done extending task lists 29946 1726882605.42074: done processing included files 29946 1726882605.42075: results queue empty 29946 1726882605.42075: checking for any_errors_fatal 29946 1726882605.42076: done checking for any_errors_fatal 29946 1726882605.42077: checking for max_fail_percentage 29946 1726882605.42078: done checking for max_fail_percentage 29946 1726882605.42078: checking to see if all hosts have failed and the running result is not ok 29946 1726882605.42079: done checking to see if all hosts have failed 29946 1726882605.42079: getting the remaining hosts for this loop 29946 1726882605.42080: done getting the remaining hosts for this loop 29946 1726882605.42082: getting the next task for host managed_node2 29946 1726882605.42084: done getting next task for host managed_node2 29946 1726882605.42086: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29946 1726882605.42088: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882605.42095: getting variables 29946 1726882605.42096: in VariableManager get_vars() 29946 1726882605.42105: Calling all_inventory to load vars for managed_node2 29946 1726882605.42107: Calling groups_inventory to load vars for managed_node2 29946 1726882605.42108: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.42111: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.42113: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.42114: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.42803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.44319: done with get_vars() 29946 1726882605.44339: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:36:45 -0400 (0:00:00.059) 0:00:31.553 ****** 29946 1726882605.44409: entering _queue_task() for managed_node2/setup 29946 1726882605.44756: worker is 1 (out of 1 available) 29946 1726882605.44769: exiting _queue_task() for managed_node2/setup 29946 1726882605.44781: done queuing things up, now waiting for results queue to drain 29946 1726882605.44783: waiting for pending results... 29946 1726882605.45122: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 29946 1726882605.45395: in run() - task 12673a56-9f93-95e7-9dfb-0000000005de 29946 1726882605.45402: variable 'ansible_search_path' from source: unknown 29946 1726882605.45405: variable 'ansible_search_path' from source: unknown 29946 1726882605.45408: calling self._execute() 29946 1726882605.45411: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.45413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.45416: variable 'omit' from source: magic vars 29946 1726882605.45785: variable 'ansible_distribution_major_version' from source: facts 29946 1726882605.45806: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882605.46028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882605.48406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882605.48479: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882605.48525: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882605.48564: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882605.48606: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882605.48686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882605.48728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 29946 1726882605.48760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882605.48813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882605.48909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882605.48912: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882605.48924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882605.48955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882605.49002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882605.49028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882605.49195: variable '__network_required_facts' from source: role '' defaults 29946 1726882605.49211: variable 'ansible_facts' from source: unknown 29946 1726882605.50112: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 29946 1726882605.50121: when evaluation is False, skipping this task 29946 1726882605.50127: _execute() done 29946 1726882605.50135: dumping result to json 29946 1726882605.50142: done dumping result, returning 29946 1726882605.50154: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-95e7-9dfb-0000000005de] 29946 1726882605.50217: sending task result for task 12673a56-9f93-95e7-9dfb-0000000005de 29946 1726882605.50285: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000005de 29946 1726882605.50288: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882605.50364: no more pending results, returning what we have 29946 1726882605.50368: results queue empty 29946 1726882605.50369: checking for any_errors_fatal 29946 1726882605.50371: done checking for any_errors_fatal 29946 1726882605.50372: checking for max_fail_percentage 29946 1726882605.50373: done checking for max_fail_percentage 29946 1726882605.50374: checking to see if all hosts have failed and the running result is not ok 29946 1726882605.50375: done checking to see if all hosts have failed 29946 1726882605.50376: getting the remaining hosts for this loop 29946 1726882605.50377: done getting the remaining hosts for 
this loop 29946 1726882605.50381: getting the next task for host managed_node2 29946 1726882605.50391: done getting next task for host managed_node2 29946 1726882605.50396: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 29946 1726882605.50399: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882605.50414: getting variables 29946 1726882605.50416: in VariableManager get_vars() 29946 1726882605.50456: Calling all_inventory to load vars for managed_node2 29946 1726882605.50459: Calling groups_inventory to load vars for managed_node2 29946 1726882605.50461: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.50472: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.50476: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.50479: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.53521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.55829: done with get_vars() 29946 1726882605.55856: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:36:45 -0400 (0:00:00.115) 0:00:31.669 ****** 29946 1726882605.55954: entering _queue_task() for managed_node2/stat 29946 1726882605.56311: worker is 1 (out of 1 available) 29946 1726882605.56325: exiting _queue_task() for managed_node2/stat 29946 1726882605.56337: done queuing things up, now waiting for results queue to drain 29946 1726882605.56339: waiting for pending results... 
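The skip recorded just above comes from the role's fact-gating pattern: set_facts.yml compares a list of required fact names against the keys already present in ansible_facts and only re-runs the setup module when something is missing. Because a full Gathering Facts pass finished moments earlier, the difference is empty and the conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0) evaluates to False. A minimal sketch of that gating pattern in playbook YAML, with illustrative values (the __required_facts list and the gather_subset value are assumptions for the example, not quoted from the role):

    # Gate an explicit fact-gathering call on whether any required fact is missing.
    - name: Ensure facts used by the role are present
      setup:
        gather_subset: min          # illustrative; the real role chooses its own subset
      when: __required_facts | difference(ansible_facts.keys() | list) | length > 0
      vars:
        # Hypothetical list; the role defines its own __network_required_facts.
        __required_facts:
          - distribution
          - distribution_major_version
          - os_family

When the facts are already cached, as in this run, the when expression is False and the task is skipped without another round trip to the managed host.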
29946 1726882605.56612: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 29946 1726882605.56899: in run() - task 12673a56-9f93-95e7-9dfb-0000000005e0 29946 1726882605.56902: variable 'ansible_search_path' from source: unknown 29946 1726882605.56905: variable 'ansible_search_path' from source: unknown 29946 1726882605.56907: calling self._execute() 29946 1726882605.56910: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.56913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.56915: variable 'omit' from source: magic vars 29946 1726882605.57299: variable 'ansible_distribution_major_version' from source: facts 29946 1726882605.57318: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882605.57499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882605.57774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882605.57831: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882605.57867: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882605.57912: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882605.58000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882605.58035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882605.58066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882605.58101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882605.58200: variable '__network_is_ostree' from source: set_fact 29946 1726882605.58212: Evaluated conditional (not __network_is_ostree is defined): False 29946 1726882605.58223: when evaluation is False, skipping this task 29946 1726882605.58231: _execute() done 29946 1726882605.58238: dumping result to json 29946 1726882605.58245: done dumping result, returning 29946 1726882605.58255: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-95e7-9dfb-0000000005e0] 29946 1726882605.58263: sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e0 skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29946 1726882605.58525: no more pending results, returning what we have 29946 1726882605.58529: results queue empty 29946 1726882605.58530: checking for any_errors_fatal 29946 1726882605.58536: done checking for any_errors_fatal 29946 1726882605.58537: checking for max_fail_percentage 29946 1726882605.58539: done checking for max_fail_percentage 29946 1726882605.58540: checking to see if all hosts have 
failed and the running result is not ok 29946 1726882605.58541: done checking to see if all hosts have failed 29946 1726882605.58541: getting the remaining hosts for this loop 29946 1726882605.58543: done getting the remaining hosts for this loop 29946 1726882605.58549: getting the next task for host managed_node2 29946 1726882605.58555: done getting next task for host managed_node2 29946 1726882605.58559: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29946 1726882605.58562: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882605.58576: getting variables 29946 1726882605.58578: in VariableManager get_vars() 29946 1726882605.58621: Calling all_inventory to load vars for managed_node2 29946 1726882605.58624: Calling groups_inventory to load vars for managed_node2 29946 1726882605.58626: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.58637: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.58640: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.58643: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.59210: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e0 29946 1726882605.59214: WORKER PROCESS EXITING 29946 1726882605.60305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.63310: done with get_vars() 29946 1726882605.63334: done getting variables 29946 1726882605.63400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:36:45 -0400 (0:00:00.074) 0:00:31.743 ****** 29946 1726882605.63437: entering _queue_task() for managed_node2/set_fact 29946 1726882605.64204: worker is 1 (out of 1 available) 29946 1726882605.64215: exiting _queue_task() for managed_node2/set_fact 29946 1726882605.64227: done queuing things up, now waiting for results queue to drain 29946 1726882605.64228: waiting for pending results... 
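Both ostree-related tasks here are skipped because __network_is_ostree was already set by an earlier pass of the role in this run, so the condition (not __network_is_ostree is defined) is False. The underlying idea is to probe once and cache the answer as a host fact. A minimal sketch of that probe-and-cache pattern (the marker path /run/ostree-booted is a common way to detect ostree-based systems and is an assumption of this example, not something shown in this log):

    # Probe for an ostree-based system only if the cached flag is absent.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed marker path for the example
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

On every later include of the role, the set_fact result is still defined for the host, so both tasks short-circuit exactly as the log shows.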
29946 1726882605.64748: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 29946 1726882605.64869: in run() - task 12673a56-9f93-95e7-9dfb-0000000005e1 29946 1726882605.64885: variable 'ansible_search_path' from source: unknown 29946 1726882605.64891: variable 'ansible_search_path' from source: unknown 29946 1726882605.65123: calling self._execute() 29946 1726882605.65395: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.65401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.65405: variable 'omit' from source: magic vars 29946 1726882605.65981: variable 'ansible_distribution_major_version' from source: facts 29946 1726882605.65994: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882605.66499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882605.66809: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882605.66862: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882605.67048: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882605.67090: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882605.67179: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882605.67326: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882605.67358: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882605.67389: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882605.67684: variable '__network_is_ostree' from source: set_fact 29946 1726882605.67898: Evaluated conditional (not __network_is_ostree is defined): False 29946 1726882605.67901: when evaluation is False, skipping this task 29946 1726882605.67904: _execute() done 29946 1726882605.67908: dumping result to json 29946 1726882605.67911: done dumping result, returning 29946 1726882605.67917: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-95e7-9dfb-0000000005e1] 29946 1726882605.67920: sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e1 29946 1726882605.67983: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e1 29946 1726882605.67988: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 29946 1726882605.68036: no more pending results, returning what we have 29946 1726882605.68040: results queue empty 29946 1726882605.68041: checking for any_errors_fatal 29946 1726882605.68047: done checking for any_errors_fatal 29946 
1726882605.68047: checking for max_fail_percentage 29946 1726882605.68049: done checking for max_fail_percentage 29946 1726882605.68050: checking to see if all hosts have failed and the running result is not ok 29946 1726882605.68051: done checking to see if all hosts have failed 29946 1726882605.68052: getting the remaining hosts for this loop 29946 1726882605.68053: done getting the remaining hosts for this loop 29946 1726882605.68057: getting the next task for host managed_node2 29946 1726882605.68066: done getting next task for host managed_node2 29946 1726882605.68069: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 29946 1726882605.68072: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882605.68088: getting variables 29946 1726882605.68090: in VariableManager get_vars() 29946 1726882605.68244: Calling all_inventory to load vars for managed_node2 29946 1726882605.68247: Calling groups_inventory to load vars for managed_node2 29946 1726882605.68249: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882605.68257: Calling all_plugins_play to load vars for managed_node2 29946 1726882605.68259: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882605.68261: Calling groups_plugins_play to load vars for managed_node2 29946 1726882605.70131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882605.72092: done with get_vars() 29946 1726882605.72119: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:36:45 -0400 (0:00:00.088) 0:00:31.832 ****** 29946 1726882605.72331: entering _queue_task() for managed_node2/service_facts 29946 1726882605.73088: worker is 1 (out of 1 available) 29946 1726882605.73104: exiting _queue_task() for managed_node2/service_facts 29946 1726882605.73117: done queuing things up, now waiting for results queue to drain 29946 1726882605.73119: waiting for pending results... 
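The service_facts task queued here is what produces the large ansible_facts.services map visible in the stdout chunks further down; the role inspects it to decide which network service (for example NetworkManager.service) is actually running on the managed host. A minimal sketch of gathering and consuming that fact (the debug task and the nm_running variable are illustrative additions, not part of the role):

    # Collect the state of all systemd-managed services, then inspect one entry.
    - name: Check which services are running
      service_facts:

    - name: Report whether NetworkManager is running
      debug:
        msg: "NetworkManager running: {{ nm_running }}"
      vars:
        nm_running: "{{ ansible_facts.services['NetworkManager.service'].state
                        | default('unknown') == 'running' }}"

service_facts takes no options and simply populates ansible_facts.services keyed by unit name, with name, state, status, and source for each entry, which is why the module payload transferred below is tiny compared with the JSON it returns.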
29946 1726882605.73556: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running 29946 1726882605.73731: in run() - task 12673a56-9f93-95e7-9dfb-0000000005e3 29946 1726882605.73758: variable 'ansible_search_path' from source: unknown 29946 1726882605.73766: variable 'ansible_search_path' from source: unknown 29946 1726882605.73813: calling self._execute() 29946 1726882605.73920: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.73932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.73946: variable 'omit' from source: magic vars 29946 1726882605.74383: variable 'ansible_distribution_major_version' from source: facts 29946 1726882605.74411: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882605.74423: variable 'omit' from source: magic vars 29946 1726882605.74488: variable 'omit' from source: magic vars 29946 1726882605.74534: variable 'omit' from source: magic vars 29946 1726882605.74581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882605.74651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882605.74656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882605.74677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882605.74697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882605.74759: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882605.74762: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.74764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.74865: Set connection var ansible_pipelining to False 29946 1726882605.74878: Set connection var ansible_shell_executable to /bin/sh 29946 1726882605.74891: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882605.74938: Set connection var ansible_timeout to 10 29946 1726882605.74942: Set connection var ansible_shell_type to sh 29946 1726882605.74980: Set connection var ansible_connection to ssh 29946 1726882605.74983: variable 'ansible_shell_executable' from source: unknown 29946 1726882605.74987: variable 'ansible_connection' from source: unknown 29946 1726882605.74995: variable 'ansible_module_compression' from source: unknown 29946 1726882605.75003: variable 'ansible_shell_type' from source: unknown 29946 1726882605.75008: variable 'ansible_shell_executable' from source: unknown 29946 1726882605.75015: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882605.75021: variable 'ansible_pipelining' from source: unknown 29946 1726882605.75059: variable 'ansible_timeout' from source: unknown 29946 1726882605.75062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882605.75394: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882605.75598: variable 'omit' from source: magic vars 29946 
1726882605.75603: starting attempt loop 29946 1726882605.75605: running the handler 29946 1726882605.75607: _low_level_execute_command(): starting 29946 1726882605.75609: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882605.76354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882605.76381: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882605.76504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882605.76738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882605.76798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882605.78444: stdout chunk (state=3): >>>/root <<< 29946 1726882605.78571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882605.78599: stdout chunk (state=3): >>><<< 29946 1726882605.78614: stderr chunk (state=3): >>><<< 29946 1726882605.78638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882605.78657: _low_level_execute_command(): starting 29946 1726882605.78744: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554 `" && echo ansible-tmp-1726882605.7864501-31444-144459233602554="` echo 
/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554 `" ) && sleep 0' 29946 1726882605.79317: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882605.79339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882605.79403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882605.79421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882605.79492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882605.79511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882605.79547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882605.79814: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882605.81576: stdout chunk (state=3): >>>ansible-tmp-1726882605.7864501-31444-144459233602554=/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554 <<< 29946 1726882605.81731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882605.81735: stdout chunk (state=3): >>><<< 29946 1726882605.81737: stderr chunk (state=3): >>><<< 29946 1726882605.81758: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882605.7864501-31444-144459233602554=/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882605.81899: variable 'ansible_module_compression' from source: unknown 29946 1726882605.81902: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 29946 1726882605.81944: variable 'ansible_facts' from source: unknown 29946 1726882605.82047: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py 29946 1726882605.82261: Sending initial data 29946 1726882605.82264: Sent initial data (162 bytes) 29946 1726882605.82909: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882605.83003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882605.83015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882605.83115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882605.84781: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882605.84860: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpb04pfa5g /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py <<< 29946 1726882605.84863: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py" <<< 29946 1726882605.85066: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpb04pfa5g" to remote "/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py" <<< 29946 1726882605.86244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882605.86316: stderr chunk (state=3): >>><<< 29946 1726882605.86329: stdout chunk (state=3): >>><<< 29946 1726882605.86356: done transferring module to remote 29946 1726882605.86375: _low_level_execute_command(): starting 29946 1726882605.86385: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/ /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py && sleep 0' 29946 1726882605.87015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882605.87070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882605.87095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882605.87111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882605.87213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882605.88973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882605.88985: stdout chunk (state=3): >>><<< 29946 1726882605.89090: stderr chunk (state=3): >>><<< 29946 1726882605.89096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882605.89106: _low_level_execute_command(): starting 29946 1726882605.89109: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/AnsiballZ_service_facts.py && sleep 0' 29946 1726882605.89670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882605.89751: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882605.89774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882605.89880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882607.42489: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 29946 1726882607.42519: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.<<< 29946 1726882607.42532: stdout chunk (state=3): >>>service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": 
{"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 29946 1726882607.42558: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 29946 1726882607.42575: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 29946 1726882607.42583: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": 
"unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": 
"systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 29946 1726882607.44065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882607.44069: stderr chunk (state=3): >>><<< 29946 1726882607.44072: stdout chunk (state=3): >>><<< 29946 1726882607.44101: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": 
{"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": 
"syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": 
"systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": 
"unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": 
"sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882607.44843: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882607.44847: _low_level_execute_command(): starting 29946 1726882607.44854: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882605.7864501-31444-144459233602554/ > /dev/null 2>&1 && sleep 0' 29946 1726882607.45422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882607.45443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882607.45446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882607.45502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882607.45514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882607.45567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882607.47465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882607.47469: stdout chunk (state=3): >>><<< 29946 1726882607.47471: stderr chunk (state=3): >>><<< 29946 1726882607.47558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882607.47562: handler run complete 29946 1726882607.47696: variable 'ansible_facts' from source: unknown 29946 1726882607.47794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882607.48545: variable 'ansible_facts' from source: unknown 29946 1726882607.48677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882607.48888: attempt loop complete, returning result 29946 1726882607.49098: _execute() done 29946 1726882607.49101: dumping result to json 29946 1726882607.49104: done dumping result, returning 29946 1726882607.49106: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-95e7-9dfb-0000000005e3] 29946 1726882607.49108: sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e3 29946 1726882607.50157: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e3 29946 1726882607.50161: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882607.50274: no more pending results, returning what we have 29946 1726882607.50277: results queue empty 29946 1726882607.50278: checking for any_errors_fatal 29946 1726882607.50288: done checking for any_errors_fatal 29946 1726882607.50290: checking for max_fail_percentage 29946 1726882607.50291: done checking for max_fail_percentage 29946 1726882607.50292: checking to see if all hosts have failed and the running result is not ok 29946 1726882607.50297: done checking to see if all hosts have failed 29946 1726882607.50298: getting the remaining hosts for this loop 29946 1726882607.50299: done getting the remaining hosts for this loop 29946 1726882607.50303: getting the next task for host managed_node2 29946 1726882607.50310: done getting next task for host managed_node2 29946 1726882607.50316: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 29946 1726882607.50320: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882607.50331: getting variables 29946 1726882607.50332: in VariableManager get_vars() 29946 1726882607.50378: Calling all_inventory to load vars for managed_node2 29946 1726882607.50383: Calling groups_inventory to load vars for managed_node2 29946 1726882607.50388: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882607.50474: Calling all_plugins_play to load vars for managed_node2 29946 1726882607.50478: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882607.50482: Calling groups_plugins_play to load vars for managed_node2 29946 1726882607.52663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882607.54346: done with get_vars() 29946 1726882607.54370: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:36:47 -0400 (0:00:01.821) 0:00:33.654 ****** 29946 1726882607.54483: entering _queue_task() for managed_node2/package_facts 29946 1726882607.54821: worker is 1 (out of 1 available) 29946 1726882607.54835: exiting _queue_task() for managed_node2/package_facts 29946 1726882607.54845: done queuing things up, now waiting for results queue to drain 29946 1726882607.54847: waiting for pending results... 29946 1726882607.55084: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 29946 1726882607.55165: in run() - task 12673a56-9f93-95e7-9dfb-0000000005e4 29946 1726882607.55177: variable 'ansible_search_path' from source: unknown 29946 1726882607.55181: variable 'ansible_search_path' from source: unknown 29946 1726882607.55217: calling self._execute() 29946 1726882607.55292: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882607.55299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882607.55312: variable 'omit' from source: magic vars 29946 1726882607.55587: variable 'ansible_distribution_major_version' from source: facts 29946 1726882607.55602: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882607.55608: variable 'omit' from source: magic vars 29946 1726882607.55646: variable 'omit' from source: magic vars 29946 1726882607.55672: variable 'omit' from source: magic vars 29946 1726882607.55707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882607.55733: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882607.55753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882607.55766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882607.55777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882607.55803: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882607.55807: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882607.55810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882607.55883: Set connection var ansible_pipelining to False 29946 
1726882607.55887: Set connection var ansible_shell_executable to /bin/sh 29946 1726882607.55896: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882607.55901: Set connection var ansible_timeout to 10 29946 1726882607.55908: Set connection var ansible_shell_type to sh 29946 1726882607.55910: Set connection var ansible_connection to ssh 29946 1726882607.55927: variable 'ansible_shell_executable' from source: unknown 29946 1726882607.55930: variable 'ansible_connection' from source: unknown 29946 1726882607.55933: variable 'ansible_module_compression' from source: unknown 29946 1726882607.55935: variable 'ansible_shell_type' from source: unknown 29946 1726882607.55937: variable 'ansible_shell_executable' from source: unknown 29946 1726882607.55940: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882607.55944: variable 'ansible_pipelining' from source: unknown 29946 1726882607.55946: variable 'ansible_timeout' from source: unknown 29946 1726882607.55950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882607.56101: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882607.56110: variable 'omit' from source: magic vars 29946 1726882607.56115: starting attempt loop 29946 1726882607.56118: running the handler 29946 1726882607.56129: _low_level_execute_command(): starting 29946 1726882607.56136: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882607.56689: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882607.56694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882607.56697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882607.56700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882607.56762: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882607.56765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882607.56767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882607.56830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882607.58601: stdout chunk (state=3): >>>/root <<< 29946 1726882607.58791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882607.58797: stdout chunk (state=3): >>><<< 29946 1726882607.58804: stderr chunk (state=3): >>><<< 29946 1726882607.58998: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882607.59004: _low_level_execute_command(): starting 29946 1726882607.59009: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677 `" && echo ansible-tmp-1726882607.5886059-31511-125316285674677="` echo /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677 `" ) && sleep 0' 29946 1726882607.59598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882607.59636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882607.59656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882607.59747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882607.61635: stdout chunk (state=3): >>>ansible-tmp-1726882607.5886059-31511-125316285674677=/root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677 <<< 29946 1726882607.61846: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882607.61855: stdout chunk (state=3): >>><<< 29946 1726882607.61873: stderr chunk (state=3): >>><<< 29946 1726882607.61945: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882607.5886059-31511-125316285674677=/root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882607.61959: variable 'ansible_module_compression' from source: unknown 29946 1726882607.62021: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 29946 1726882607.62111: variable 'ansible_facts' from source: unknown 29946 1726882607.62307: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py 29946 1726882607.62500: Sending initial data 29946 1726882607.62504: Sent initial data (162 bytes) 29946 1726882607.63151: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882607.63215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882607.63284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882607.63304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882607.63344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882607.63506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882607.65054: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: 
Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882607.65153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882607.65235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpoidx38hn /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py <<< 29946 1726882607.65238: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py" <<< 29946 1726882607.65321: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpoidx38hn" to remote "/root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py" <<< 29946 1726882607.68258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882607.68262: stdout chunk (state=3): >>><<< 29946 1726882607.68264: stderr chunk (state=3): >>><<< 29946 1726882607.68267: done transferring module to remote 29946 1726882607.68269: _low_level_execute_command(): starting 29946 1726882607.68315: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/ /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py && sleep 0' 29946 1726882607.69044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882607.69150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882607.69177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882607.69366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882607.69439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882607.69551: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 29946 1726882607.71324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882607.71374: stderr chunk (state=3): >>><<< 29946 1726882607.71377: stdout chunk (state=3): >>><<< 29946 1726882607.71424: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882607.71428: _low_level_execute_command(): starting 29946 1726882607.71430: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/AnsiballZ_package_facts.py && sleep 0' 29946 1726882607.72198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882607.72214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882607.72229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882607.72255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882607.72276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882607.72289: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882607.72374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882607.72415: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882607.72470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882607.72637: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882607.72709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882608.15995: stdout chunk (state=3): 
>>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", 
"version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 29946 1726882608.16035: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 29946 1726882608.16152: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": 
[{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", 
"version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": 
"4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": 
"4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 29946 1726882608.16229: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 29946 1726882608.16248: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": 
"1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 29946 1726882608.17926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882608.17967: stderr chunk (state=3): >>><<< 29946 1726882608.17975: stdout chunk (state=3): >>><<< 29946 1726882608.18018: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882608.24262: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882608.24275: _low_level_execute_command(): starting 29946 1726882608.24280: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882607.5886059-31511-125316285674677/ > /dev/null 2>&1 && sleep 0' 29946 1726882608.24758: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882608.24761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882608.24763: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882608.24766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882608.24819: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882608.24822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882608.24828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882608.24889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882608.26761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882608.26787: stderr chunk (state=3): >>><<< 29946 1726882608.26792: stdout chunk (state=3): >>><<< 29946 1726882608.26819: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882608.26822: handler run complete 29946 1726882608.27420: variable 'ansible_facts' from source: unknown 29946 1726882608.27681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.29314: variable 'ansible_facts' from source: unknown 29946 1726882608.29527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.29943: attempt loop complete, returning result 29946 1726882608.29953: _execute() done 29946 1726882608.29957: dumping result to json 29946 1726882608.30072: done dumping result, returning 29946 1726882608.30097: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-95e7-9dfb-0000000005e4] 29946 1726882608.30103: sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e4 29946 1726882608.37142: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000005e4 29946 1726882608.37145: WORKER PROCESS EXITING ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882608.37242: no more pending results, returning what we have 29946 1726882608.37245: results queue empty 29946 1726882608.37246: checking for any_errors_fatal 29946 1726882608.37249: done checking for any_errors_fatal 29946 1726882608.37250: checking for max_fail_percentage 29946 1726882608.37251: done checking for max_fail_percentage 29946 1726882608.37251: checking to see if all hosts have failed and the running result is not ok 29946 1726882608.37252: done checking to see if all hosts have failed 29946 1726882608.37253: getting the remaining hosts for this loop 29946 1726882608.37254: done getting the remaining hosts for this loop 29946 1726882608.37256: getting the next task for host managed_node2 29946 1726882608.37261: done getting next task for host managed_node2 29946 1726882608.37263: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 29946 1726882608.37265: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882608.37272: getting variables 29946 1726882608.37273: in VariableManager get_vars() 29946 1726882608.37292: Calling all_inventory to load vars for managed_node2 29946 1726882608.37299: Calling groups_inventory to load vars for managed_node2 29946 1726882608.37301: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882608.37308: Calling all_plugins_play to load vars for managed_node2 29946 1726882608.37310: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882608.37312: Calling groups_plugins_play to load vars for managed_node2 29946 1726882608.40011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.43257: done with get_vars() 29946 1726882608.43288: done getting variables 29946 1726882608.43735: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:36:48 -0400 (0:00:00.892) 0:00:34.547 ****** 29946 1726882608.43765: entering _queue_task() for managed_node2/debug 29946 1726882608.44312: worker is 1 (out of 1 available) 29946 1726882608.44323: exiting _queue_task() for managed_node2/debug 29946 1726882608.44333: done queuing things up, now waiting for results queue to drain 29946 1726882608.44335: waiting for pending results... 29946 1726882608.44410: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider 29946 1726882608.44537: in run() - task 12673a56-9f93-95e7-9dfb-000000000092 29946 1726882608.44564: variable 'ansible_search_path' from source: unknown 29946 1726882608.44572: variable 'ansible_search_path' from source: unknown 29946 1726882608.44619: calling self._execute() 29946 1726882608.44730: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.44744: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.44759: variable 'omit' from source: magic vars 29946 1726882608.45360: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.45423: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882608.45434: variable 'omit' from source: magic vars 29946 1726882608.45723: variable 'omit' from source: magic vars 29946 1726882608.46198: variable 'network_provider' from source: set_fact 29946 1726882608.46202: variable 'omit' from source: magic vars 29946 1726882608.46204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882608.46207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882608.46209: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882608.46212: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882608.46214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 
1726882608.46404: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882608.46413: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.46423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.46533: Set connection var ansible_pipelining to False 29946 1726882608.46799: Set connection var ansible_shell_executable to /bin/sh 29946 1726882608.46803: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882608.46805: Set connection var ansible_timeout to 10 29946 1726882608.46808: Set connection var ansible_shell_type to sh 29946 1726882608.46810: Set connection var ansible_connection to ssh 29946 1726882608.46812: variable 'ansible_shell_executable' from source: unknown 29946 1726882608.46815: variable 'ansible_connection' from source: unknown 29946 1726882608.46818: variable 'ansible_module_compression' from source: unknown 29946 1726882608.46820: variable 'ansible_shell_type' from source: unknown 29946 1726882608.46822: variable 'ansible_shell_executable' from source: unknown 29946 1726882608.46824: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.46827: variable 'ansible_pipelining' from source: unknown 29946 1726882608.46829: variable 'ansible_timeout' from source: unknown 29946 1726882608.46831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.46949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882608.47297: variable 'omit' from source: magic vars 29946 1726882608.47300: starting attempt loop 29946 1726882608.47303: running the handler 29946 1726882608.47305: handler run complete 29946 1726882608.47307: attempt loop complete, returning result 29946 1726882608.47309: _execute() done 29946 1726882608.47311: dumping result to json 29946 1726882608.47314: done dumping result, returning 29946 1726882608.47317: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-95e7-9dfb-000000000092] 29946 1726882608.47320: sending task result for task 12673a56-9f93-95e7-9dfb-000000000092 29946 1726882608.47429: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000092 ok: [managed_node2] => {} MSG: Using network provider: nm 29946 1726882608.47494: WORKER PROCESS EXITING 29946 1726882608.47504: no more pending results, returning what we have 29946 1726882608.47508: results queue empty 29946 1726882608.47509: checking for any_errors_fatal 29946 1726882608.47523: done checking for any_errors_fatal 29946 1726882608.47524: checking for max_fail_percentage 29946 1726882608.47526: done checking for max_fail_percentage 29946 1726882608.47527: checking to see if all hosts have failed and the running result is not ok 29946 1726882608.47528: done checking to see if all hosts have failed 29946 1726882608.47528: getting the remaining hosts for this loop 29946 1726882608.47530: done getting the remaining hosts for this loop 29946 1726882608.47544: getting the next task for host managed_node2 29946 1726882608.47552: done getting next task for host managed_node2 29946 1726882608.47556: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 29946 1726882608.47558: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882608.47569: getting variables 29946 1726882608.47570: in VariableManager get_vars() 29946 1726882608.47714: Calling all_inventory to load vars for managed_node2 29946 1726882608.47717: Calling groups_inventory to load vars for managed_node2 29946 1726882608.47719: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882608.47729: Calling all_plugins_play to load vars for managed_node2 29946 1726882608.47732: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882608.47734: Calling groups_plugins_play to load vars for managed_node2 29946 1726882608.50632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.53837: done with get_vars() 29946 1726882608.53865: done getting variables 29946 1726882608.54134: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:36:48 -0400 (0:00:00.103) 0:00:34.651 ****** 29946 1726882608.54167: entering _queue_task() for managed_node2/fail 29946 1726882608.54990: worker is 1 (out of 1 available) 29946 1726882608.55002: exiting _queue_task() for managed_node2/fail 29946 1726882608.55012: done queuing things up, now waiting for results queue to drain 29946 1726882608.55014: waiting for pending results... 
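The task queued above lives at the role's tasks/main.yml:11 and guards against combining the `network_state` variable with the initscripts provider. The role source itself is not reproduced in this log; a minimal sketch of such a guard task, with the message wording and the provider check assumed rather than taken from the role, might look like:

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is only supported with the nm provider.  # assumed wording
      when:
        - network_state != {}                    # the conditional evaluated in the trace below
        - network_provider == "initscripts"      # assumed companion check, inferred from the task name

Because `network_state` falls back to the role default of `{}` in this run, the first condition evaluates to False and the task is skipped, as the following trace records.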
29946 1726882608.55432: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 29946 1726882608.55902: in run() - task 12673a56-9f93-95e7-9dfb-000000000093 29946 1726882608.55907: variable 'ansible_search_path' from source: unknown 29946 1726882608.55911: variable 'ansible_search_path' from source: unknown 29946 1726882608.55913: calling self._execute() 29946 1726882608.55916: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.56199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.56203: variable 'omit' from source: magic vars 29946 1726882608.56832: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.56852: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882608.56980: variable 'network_state' from source: role '' defaults 29946 1726882608.57111: Evaluated conditional (network_state != {}): False 29946 1726882608.57205: when evaluation is False, skipping this task 29946 1726882608.57215: _execute() done 29946 1726882608.57225: dumping result to json 29946 1726882608.57233: done dumping result, returning 29946 1726882608.57245: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-95e7-9dfb-000000000093] 29946 1726882608.57257: sending task result for task 12673a56-9f93-95e7-9dfb-000000000093 skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882608.57416: no more pending results, returning what we have 29946 1726882608.57420: results queue empty 29946 1726882608.57421: checking for any_errors_fatal 29946 1726882608.57430: done checking for any_errors_fatal 29946 1726882608.57431: checking for max_fail_percentage 29946 1726882608.57433: done checking for max_fail_percentage 29946 1726882608.57434: checking to see if all hosts have failed and the running result is not ok 29946 1726882608.57435: done checking to see if all hosts have failed 29946 1726882608.57436: getting the remaining hosts for this loop 29946 1726882608.57437: done getting the remaining hosts for this loop 29946 1726882608.57441: getting the next task for host managed_node2 29946 1726882608.57446: done getting next task for host managed_node2 29946 1726882608.57450: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29946 1726882608.57453: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882608.57470: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000093 29946 1726882608.57474: WORKER PROCESS EXITING 29946 1726882608.57480: getting variables 29946 1726882608.57482: in VariableManager get_vars() 29946 1726882608.57519: Calling all_inventory to load vars for managed_node2 29946 1726882608.57521: Calling groups_inventory to load vars for managed_node2 29946 1726882608.57523: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882608.57534: Calling all_plugins_play to load vars for managed_node2 29946 1726882608.57536: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882608.57539: Calling groups_plugins_play to load vars for managed_node2 29946 1726882608.61471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.65609: done with get_vars() 29946 1726882608.65639: done getting variables 29946 1726882608.65699: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:36:48 -0400 (0:00:00.115) 0:00:34.766 ****** 29946 1726882608.65730: entering _queue_task() for managed_node2/fail 29946 1726882608.66057: worker is 1 (out of 1 available) 29946 1726882608.66070: exiting _queue_task() for managed_node2/fail 29946 1726882608.66082: done queuing things up, now waiting for results queue to drain 29946 1726882608.66083: waiting for pending results... 
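The next task, at the role's tasks/main.yml:18, applies the same fail-task pattern but gates on the managed host's major version. A hedged sketch, with the version comparison inferred from the task name rather than copied from the role, might be:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state requires EL8 or later.   # assumed wording
      when:
        - network_state != {}                                # evaluates to False here, so the task is skipped
        - ansible_distribution_major_version | int < 8       # assumed check, inferred from the task name

As with the previous guard, the empty `network_state` short-circuits the condition list and the task is skipped in the trace that follows.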
29946 1726882608.66355: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 29946 1726882608.66467: in run() - task 12673a56-9f93-95e7-9dfb-000000000094 29946 1726882608.66488: variable 'ansible_search_path' from source: unknown 29946 1726882608.66499: variable 'ansible_search_path' from source: unknown 29946 1726882608.66541: calling self._execute() 29946 1726882608.66643: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.67001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.67005: variable 'omit' from source: magic vars 29946 1726882608.67420: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.67512: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882608.67691: variable 'network_state' from source: role '' defaults 29946 1726882608.67774: Evaluated conditional (network_state != {}): False 29946 1726882608.67782: when evaluation is False, skipping this task 29946 1726882608.67789: _execute() done 29946 1726882608.67798: dumping result to json 29946 1726882608.67805: done dumping result, returning 29946 1726882608.67815: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-95e7-9dfb-000000000094] 29946 1726882608.67876: sending task result for task 12673a56-9f93-95e7-9dfb-000000000094 29946 1726882608.68100: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000094 29946 1726882608.68104: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882608.68149: no more pending results, returning what we have 29946 1726882608.68153: results queue empty 29946 1726882608.68154: checking for any_errors_fatal 29946 1726882608.68160: done checking for any_errors_fatal 29946 1726882608.68161: checking for max_fail_percentage 29946 1726882608.68163: done checking for max_fail_percentage 29946 1726882608.68164: checking to see if all hosts have failed and the running result is not ok 29946 1726882608.68165: done checking to see if all hosts have failed 29946 1726882608.68165: getting the remaining hosts for this loop 29946 1726882608.68167: done getting the remaining hosts for this loop 29946 1726882608.68171: getting the next task for host managed_node2 29946 1726882608.68177: done getting next task for host managed_node2 29946 1726882608.68180: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29946 1726882608.68183: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882608.68198: getting variables 29946 1726882608.68200: in VariableManager get_vars() 29946 1726882608.68237: Calling all_inventory to load vars for managed_node2 29946 1726882608.68239: Calling groups_inventory to load vars for managed_node2 29946 1726882608.68242: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882608.68253: Calling all_plugins_play to load vars for managed_node2 29946 1726882608.68257: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882608.68259: Calling groups_plugins_play to load vars for managed_node2 29946 1726882608.70603: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.73667: done with get_vars() 29946 1726882608.73691: done getting variables 29946 1726882608.73750: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:36:48 -0400 (0:00:00.080) 0:00:34.847 ****** 29946 1726882608.73781: entering _queue_task() for managed_node2/fail 29946 1726882608.74106: worker is 1 (out of 1 available) 29946 1726882608.74119: exiting _queue_task() for managed_node2/fail 29946 1726882608.74131: done queuing things up, now waiting for results queue to drain 29946 1726882608.74133: waiting for pending results... 
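The teaming guard queued here (roles/network/tasks/main.yml:25) skips because its team-detection expression evaluates False; the trace below records that expression verbatim as the false_condition. A minimal, self-contained playbook (not part of the original run, with made-up sample data) shows how that same expression behaves when no connection has type "team":

- hosts: localhost
  gather_facts: false
  vars:
    network_connections:
      - name: eth0          # sample profile only; no 'type: team' entries, so the test yields False
        type: ethernet
    network_state: {}
  tasks:
    - name: Show how the team-detection conditional evaluates
      ansible.builtin.debug:
        msg: >-
          {{ network_connections | selectattr("type", "defined")
             | selectattr("type", "match", "^team$") | list | length > 0
             or network_state.get("interfaces", [])
             | selectattr("type", "defined")
             | selectattr("type", "match", "^team$") | list | length > 0 }}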
29946 1726882608.74515: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 29946 1726882608.74546: in run() - task 12673a56-9f93-95e7-9dfb-000000000095 29946 1726882608.74568: variable 'ansible_search_path' from source: unknown 29946 1726882608.74576: variable 'ansible_search_path' from source: unknown 29946 1726882608.74625: calling self._execute() 29946 1726882608.74739: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.74756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.74773: variable 'omit' from source: magic vars 29946 1726882608.75172: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.75190: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882608.75368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882608.77660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882608.77700: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882608.77738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882608.77779: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882608.77878: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882608.77901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.77952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.77990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.78037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.78055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.78159: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.78181: Evaluated conditional (ansible_distribution_major_version | int > 9): True 29946 1726882608.78297: variable 'ansible_distribution' from source: facts 29946 1726882608.78312: variable '__network_rh_distros' from source: role '' defaults 29946 1726882608.78326: Evaluated conditional (ansible_distribution in __network_rh_distros): True 29946 1726882608.78633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.78636: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.78638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.78659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.78677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.78725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.78755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.78781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.78824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.78842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.78889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.78918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.78945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.78990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.79067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.79313: variable 'network_connections' from source: play vars 29946 1726882608.79328: variable 'profile' from source: play vars 29946 1726882608.79404: variable 'profile' from source: play vars 29946 1726882608.79413: variable 'interface' from source: set_fact 29946 1726882608.79469: variable 'interface' from source: set_fact 29946 1726882608.79485: variable 'network_state' from source: role '' defaults 29946 
1726882608.79555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882608.79732: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882608.79774: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882608.79810: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882608.79898: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882608.79901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882608.79940: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882608.79972: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.80006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882608.80040: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 29946 1726882608.80053: when evaluation is False, skipping this task 29946 1726882608.80155: _execute() done 29946 1726882608.80158: dumping result to json 29946 1726882608.80160: done dumping result, returning 29946 1726882608.80163: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-95e7-9dfb-000000000095] 29946 1726882608.80165: sending task result for task 12673a56-9f93-95e7-9dfb-000000000095 29946 1726882608.80240: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000095 29946 1726882608.80243: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 29946 1726882608.80287: no more pending results, returning what we have 29946 1726882608.80291: results queue empty 29946 1726882608.80292: checking for any_errors_fatal 29946 1726882608.80298: done checking for any_errors_fatal 29946 1726882608.80299: checking for max_fail_percentage 29946 1726882608.80301: done checking for max_fail_percentage 29946 1726882608.80302: checking to see if all hosts have failed and the running result is not ok 29946 1726882608.80303: done checking to see if all hosts have failed 29946 1726882608.80304: getting the remaining hosts for this loop 29946 1726882608.80305: done getting the remaining hosts for this loop 29946 1726882608.80309: getting the next 
task for host managed_node2 29946 1726882608.80314: done getting next task for host managed_node2 29946 1726882608.80318: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29946 1726882608.80320: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882608.80333: getting variables 29946 1726882608.80334: in VariableManager get_vars() 29946 1726882608.80373: Calling all_inventory to load vars for managed_node2 29946 1726882608.80375: Calling groups_inventory to load vars for managed_node2 29946 1726882608.80377: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882608.80388: Calling all_plugins_play to load vars for managed_node2 29946 1726882608.80390: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882608.80395: Calling groups_plugins_play to load vars for managed_node2 29946 1726882608.81985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.84250: done with get_vars() 29946 1726882608.84287: done getting variables 29946 1726882608.84355: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:36:48 -0400 (0:00:00.106) 0:00:34.953 ****** 29946 1726882608.84390: entering _queue_task() for managed_node2/dnf 29946 1726882608.85150: worker is 1 (out of 1 available) 29946 1726882608.85163: exiting _queue_task() for managed_node2/dnf 29946 1726882608.85175: done queuing things up, now waiting for results queue to drain 29946 1726882608.85176: waiting for pending results... 
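The DNF update check queued here (roles/network/tasks/main.yml:36) is gated twice in the trace: the distribution check evaluates True and the wireless/team check is reported as the false_condition, so the task skips. The module arguments are not shown in this log; the sketch below is an assumption-marked reconstruction using the conditions the trace does report, with a check-mode dnf call and a hypothetical register name standing in for the real body.

- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  ansible.builtin.dnf:
    name: "{{ network_packages }}"     # package list is an assumption; not visible in this log
    state: latest
  check_mode: true
  register: __network_dnf_check        # hypothetical register name, for illustration only
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # evaluated True in the trace
    - __network_wireless_connections_defined or __network_team_connections_defined       # reported as false_condition, so the task is skipped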
29946 1726882608.85910: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 29946 1726882608.85915: in run() - task 12673a56-9f93-95e7-9dfb-000000000096 29946 1726882608.85919: variable 'ansible_search_path' from source: unknown 29946 1726882608.86004: variable 'ansible_search_path' from source: unknown 29946 1726882608.86258: calling self._execute() 29946 1726882608.86516: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882608.86623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882608.86627: variable 'omit' from source: magic vars 29946 1726882608.87085: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.87180: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882608.87616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882608.91942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882608.92050: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882608.92103: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882608.92148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882608.92186: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882608.92276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.92708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.92744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.92795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.92819: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.92954: variable 'ansible_distribution' from source: facts 29946 1726882608.92968: variable 'ansible_distribution_major_version' from source: facts 29946 1726882608.92989: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 29946 1726882608.93120: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882608.93265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.93300: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.93335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.93480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.93484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.93487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.93489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.93502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.93546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.93571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.93637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882608.93749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882608.93796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.93852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882608.93935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882608.94059: variable 'network_connections' from source: play vars 29946 1726882608.94077: variable 'profile' from source: play vars 29946 1726882608.94158: variable 'profile' from source: play vars 29946 1726882608.94168: variable 'interface' from source: set_fact 29946 1726882608.94231: variable 'interface' from source: set_fact 29946 1726882608.94320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 29946 1726882608.94500: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882608.94543: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882608.94683: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882608.94688: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882608.94691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882608.94694: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882608.94729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882608.94756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882608.94808: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882608.95058: variable 'network_connections' from source: play vars 29946 1726882608.95079: variable 'profile' from source: play vars 29946 1726882608.95252: variable 'profile' from source: play vars 29946 1726882608.95254: variable 'interface' from source: set_fact 29946 1726882608.95280: variable 'interface' from source: set_fact 29946 1726882608.95310: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882608.95317: when evaluation is False, skipping this task 29946 1726882608.95324: _execute() done 29946 1726882608.95331: dumping result to json 29946 1726882608.95338: done dumping result, returning 29946 1726882608.95348: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000096] 29946 1726882608.95361: sending task result for task 12673a56-9f93-95e7-9dfb-000000000096 29946 1726882608.95617: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000096 29946 1726882608.95620: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882608.95673: no more pending results, returning what we have 29946 1726882608.95676: results queue empty 29946 1726882608.95677: checking for any_errors_fatal 29946 1726882608.95684: done checking for any_errors_fatal 29946 1726882608.95685: checking for max_fail_percentage 29946 1726882608.95686: done checking for max_fail_percentage 29946 1726882608.95688: checking to see if all hosts have failed and the running result is not ok 29946 1726882608.95688: done checking to see if all hosts have failed 29946 1726882608.95689: getting the remaining hosts for this loop 29946 1726882608.95691: done getting the remaining hosts for this loop 29946 
1726882608.95697: getting the next task for host managed_node2 29946 1726882608.95708: done getting next task for host managed_node2 29946 1726882608.95712: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29946 1726882608.95714: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882608.95726: getting variables 29946 1726882608.95728: in VariableManager get_vars() 29946 1726882608.95763: Calling all_inventory to load vars for managed_node2 29946 1726882608.95765: Calling groups_inventory to load vars for managed_node2 29946 1726882608.95767: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882608.95777: Calling all_plugins_play to load vars for managed_node2 29946 1726882608.95779: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882608.95782: Calling groups_plugins_play to load vars for managed_node2 29946 1726882608.97638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882608.99971: done with get_vars() 29946 1726882609.00051: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 29946 1726882609.00165: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:36:49 -0400 (0:00:00.158) 0:00:35.111 ****** 29946 1726882609.00207: entering _queue_task() for managed_node2/yum 29946 1726882609.00573: worker is 1 (out of 1 available) 29946 1726882609.00587: exiting _queue_task() for managed_node2/yum 29946 1726882609.00601: done queuing things up, now waiting for results queue to drain 29946 1726882609.00603: waiting for pending results... 
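The YUM variant queued here (roles/network/tasks/main.yml:48) is the legacy counterpart of the DNF check: the trace notes that ansible-core redirects the ansible.builtin.yum action to ansible.builtin.dnf, and the skip below reports false_condition "ansible_distribution_major_version | int < 8", so it only applies to EL7-era hosts. A hedged sketch of that shape, with assumed module arguments:

- name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
  ansible.builtin.yum:                 # redirected to ansible.builtin.dnf by ansible-core, as the trace notes
    name: "{{ network_packages }}"     # arguments are assumptions; not visible in this log
    state: latest
  check_mode: true
  when:
    - ansible_distribution_major_version | int < 8   # false_condition here: this managed host is newer than EL7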
29946 1726882609.00872: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 29946 1726882609.00970: in run() - task 12673a56-9f93-95e7-9dfb-000000000097 29946 1726882609.01077: variable 'ansible_search_path' from source: unknown 29946 1726882609.01080: variable 'ansible_search_path' from source: unknown 29946 1726882609.01083: calling self._execute() 29946 1726882609.01144: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.01157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.01173: variable 'omit' from source: magic vars 29946 1726882609.01554: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.01574: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.02200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882609.05133: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882609.05210: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882609.05255: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882609.05304: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882609.05333: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882609.05418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.05465: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.05498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.05549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.05568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.05675: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.05698: Evaluated conditional (ansible_distribution_major_version | int < 8): False 29946 1726882609.05706: when evaluation is False, skipping this task 29946 1726882609.05712: _execute() done 29946 1726882609.05726: dumping result to json 29946 1726882609.05733: done dumping result, returning 29946 1726882609.05746: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000097] 29946 
1726882609.05835: sending task result for task 12673a56-9f93-95e7-9dfb-000000000097 29946 1726882609.05911: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000097 29946 1726882609.05914: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 29946 1726882609.05989: no more pending results, returning what we have 29946 1726882609.05992: results queue empty 29946 1726882609.05995: checking for any_errors_fatal 29946 1726882609.06002: done checking for any_errors_fatal 29946 1726882609.06005: checking for max_fail_percentage 29946 1726882609.06007: done checking for max_fail_percentage 29946 1726882609.06008: checking to see if all hosts have failed and the running result is not ok 29946 1726882609.06009: done checking to see if all hosts have failed 29946 1726882609.06010: getting the remaining hosts for this loop 29946 1726882609.06012: done getting the remaining hosts for this loop 29946 1726882609.06016: getting the next task for host managed_node2 29946 1726882609.06023: done getting next task for host managed_node2 29946 1726882609.06027: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29946 1726882609.06029: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882609.06044: getting variables 29946 1726882609.06045: in VariableManager get_vars() 29946 1726882609.06087: Calling all_inventory to load vars for managed_node2 29946 1726882609.06090: Calling groups_inventory to load vars for managed_node2 29946 1726882609.06092: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882609.06109: Calling all_plugins_play to load vars for managed_node2 29946 1726882609.06112: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882609.06115: Calling groups_plugins_play to load vars for managed_node2 29946 1726882609.07900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882609.09573: done with get_vars() 29946 1726882609.09606: done getting variables 29946 1726882609.09667: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:36:49 -0400 (0:00:00.095) 0:00:35.206 ****** 29946 1726882609.09705: entering _queue_task() for managed_node2/fail 29946 1726882609.10225: worker is 1 (out of 1 available) 29946 1726882609.10236: exiting _queue_task() for managed_node2/fail 29946 1726882609.10246: done queuing things up, now waiting for results queue to drain 29946 1726882609.10248: waiting for pending results... 
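The consent task queued here (roles/network/tasks/main.yml:60) also loads the 'fail' action plugin and skips on the same wireless/team boolean as the DNF check. The role-default variables __network_wireless_connections_defined and __network_team_connections_defined are not expanded in this log; only the task shape implied by the trace is sketched below, with the message text assumed.

- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    # message text is an assumption for illustration
    msg: Handling wireless or team interfaces requires restarting NetworkManager; please confirm before re-running.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined   # reported as false_condition in the trace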
29946 1726882609.10487: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 29946 1726882609.10495: in run() - task 12673a56-9f93-95e7-9dfb-000000000098 29946 1726882609.10508: variable 'ansible_search_path' from source: unknown 29946 1726882609.10516: variable 'ansible_search_path' from source: unknown 29946 1726882609.10556: calling self._execute() 29946 1726882609.10666: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.10679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.10704: variable 'omit' from source: magic vars 29946 1726882609.11091: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.11113: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.11243: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882609.11441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882609.13984: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882609.14057: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882609.14107: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882609.14183: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882609.14186: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882609.14259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.14302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.14327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.14363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.14379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.14498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.14503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.14505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.14621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.14624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.14626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.14627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.14629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.14656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.14671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.14845: variable 'network_connections' from source: play vars 29946 1726882609.14860: variable 'profile' from source: play vars 29946 1726882609.14946: variable 'profile' from source: play vars 29946 1726882609.14949: variable 'interface' from source: set_fact 29946 1726882609.15045: variable 'interface' from source: set_fact 29946 1726882609.15081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882609.15262: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882609.15328: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882609.15356: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882609.15360: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882609.15433: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882609.15436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882609.15451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.15476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882609.15528: 
variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882609.15798: variable 'network_connections' from source: play vars 29946 1726882609.15801: variable 'profile' from source: play vars 29946 1726882609.15830: variable 'profile' from source: play vars 29946 1726882609.15833: variable 'interface' from source: set_fact 29946 1726882609.15889: variable 'interface' from source: set_fact 29946 1726882609.15931: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882609.15934: when evaluation is False, skipping this task 29946 1726882609.15936: _execute() done 29946 1726882609.15939: dumping result to json 29946 1726882609.15941: done dumping result, returning 29946 1726882609.15943: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-000000000098] 29946 1726882609.15951: sending task result for task 12673a56-9f93-95e7-9dfb-000000000098 skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882609.16187: no more pending results, returning what we have 29946 1726882609.16191: results queue empty 29946 1726882609.16192: checking for any_errors_fatal 29946 1726882609.16200: done checking for any_errors_fatal 29946 1726882609.16201: checking for max_fail_percentage 29946 1726882609.16203: done checking for max_fail_percentage 29946 1726882609.16204: checking to see if all hosts have failed and the running result is not ok 29946 1726882609.16205: done checking to see if all hosts have failed 29946 1726882609.16205: getting the remaining hosts for this loop 29946 1726882609.16207: done getting the remaining hosts for this loop 29946 1726882609.16210: getting the next task for host managed_node2 29946 1726882609.16217: done getting next task for host managed_node2 29946 1726882609.16220: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 29946 1726882609.16222: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882609.16235: getting variables 29946 1726882609.16236: in VariableManager get_vars() 29946 1726882609.16274: Calling all_inventory to load vars for managed_node2 29946 1726882609.16276: Calling groups_inventory to load vars for managed_node2 29946 1726882609.16278: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882609.16287: Calling all_plugins_play to load vars for managed_node2 29946 1726882609.16290: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882609.16292: Calling groups_plugins_play to load vars for managed_node2 29946 1726882609.16906: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000098 29946 1726882609.16910: WORKER PROCESS EXITING 29946 1726882609.17982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882609.19130: done with get_vars() 29946 1726882609.19148: done getting variables 29946 1726882609.19190: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:36:49 -0400 (0:00:00.095) 0:00:35.301 ****** 29946 1726882609.19215: entering _queue_task() for managed_node2/package 29946 1726882609.19457: worker is 1 (out of 1 available) 29946 1726882609.19470: exiting _queue_task() for managed_node2/package 29946 1726882609.19482: done queuing things up, now waiting for results queue to drain 29946 1726882609.19484: waiting for pending results... 
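The "Install packages" task queued here (roles/network/tasks/main.yml:73) skips because, per the false_condition in the trace, every entry in network_packages is already a subset of ansible_facts.packages (the inventory of installed packages, typically gathered earlier with ansible.builtin.package_facts, though that gathering is outside this excerpt). The when expression below is taken verbatim from the log; the module arguments are assumptions.

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # assumed arguments; the log only shows the action plugin and the condition
    state: present
  when:
    - not network_packages is subset(ansible_facts.packages.keys())   # false_condition: required packages are already installed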
29946 1726882609.19664: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 29946 1726882609.19739: in run() - task 12673a56-9f93-95e7-9dfb-000000000099 29946 1726882609.19751: variable 'ansible_search_path' from source: unknown 29946 1726882609.19754: variable 'ansible_search_path' from source: unknown 29946 1726882609.19781: calling self._execute() 29946 1726882609.19863: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.19869: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.19877: variable 'omit' from source: magic vars 29946 1726882609.20178: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.20189: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.20327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882609.20559: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882609.20696: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882609.20700: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882609.20710: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882609.20785: variable 'network_packages' from source: role '' defaults 29946 1726882609.20861: variable '__network_provider_setup' from source: role '' defaults 29946 1726882609.20870: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882609.20928: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882609.20931: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882609.20974: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882609.21086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882609.22640: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882609.22682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882609.22717: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882609.22759: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882609.22781: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882609.22841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.22861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.22883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.22913: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.22923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.22954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.22970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.22990: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.23017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.23027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.23174: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29946 1726882609.23248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.23264: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.23280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.23312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.23323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.23382: variable 'ansible_python' from source: facts 29946 1726882609.23406: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29946 1726882609.23461: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882609.23518: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882609.23612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.23629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29946 1726882609.23647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.23671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.23682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.23718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.23737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.23755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.23780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.23791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.23889: variable 'network_connections' from source: play vars 29946 1726882609.23916: variable 'profile' from source: play vars 29946 1726882609.24008: variable 'profile' from source: play vars 29946 1726882609.24020: variable 'interface' from source: set_fact 29946 1726882609.24299: variable 'interface' from source: set_fact 29946 1726882609.24302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882609.24305: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882609.24307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.24309: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882609.24311: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882609.24574: variable 'network_connections' from source: play vars 29946 1726882609.24585: variable 'profile' from source: play vars 29946 1726882609.24686: variable 'profile' from source: play vars 29946 1726882609.24704: variable 'interface' from source: set_fact 29946 1726882609.24772: variable 'interface' from source: set_fact 29946 1726882609.24817: variable 
'__network_packages_default_wireless' from source: role '' defaults 29946 1726882609.24900: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882609.25162: variable 'network_connections' from source: play vars 29946 1726882609.25173: variable 'profile' from source: play vars 29946 1726882609.25224: variable 'profile' from source: play vars 29946 1726882609.25227: variable 'interface' from source: set_fact 29946 1726882609.25297: variable 'interface' from source: set_fact 29946 1726882609.25316: variable '__network_packages_default_team' from source: role '' defaults 29946 1726882609.25370: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882609.25565: variable 'network_connections' from source: play vars 29946 1726882609.25568: variable 'profile' from source: play vars 29946 1726882609.25618: variable 'profile' from source: play vars 29946 1726882609.25621: variable 'interface' from source: set_fact 29946 1726882609.25689: variable 'interface' from source: set_fact 29946 1726882609.25730: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882609.25773: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882609.25777: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882609.25823: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882609.25957: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29946 1726882609.26253: variable 'network_connections' from source: play vars 29946 1726882609.26256: variable 'profile' from source: play vars 29946 1726882609.26303: variable 'profile' from source: play vars 29946 1726882609.26307: variable 'interface' from source: set_fact 29946 1726882609.26352: variable 'interface' from source: set_fact 29946 1726882609.26359: variable 'ansible_distribution' from source: facts 29946 1726882609.26362: variable '__network_rh_distros' from source: role '' defaults 29946 1726882609.26368: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.26379: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29946 1726882609.26489: variable 'ansible_distribution' from source: facts 29946 1726882609.26492: variable '__network_rh_distros' from source: role '' defaults 29946 1726882609.26510: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.26522: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29946 1726882609.26628: variable 'ansible_distribution' from source: facts 29946 1726882609.26632: variable '__network_rh_distros' from source: role '' defaults 29946 1726882609.26637: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.26664: variable 'network_provider' from source: set_fact 29946 1726882609.26676: variable 'ansible_facts' from source: unknown 29946 1726882609.27698: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 29946 1726882609.27702: when evaluation is False, skipping this task 29946 1726882609.27704: _execute() done 29946 1726882609.27707: dumping result to json 29946 1726882609.27710: done dumping result, returning 29946 1726882609.27715: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install packages 
[12673a56-9f93-95e7-9dfb-000000000099] 29946 1726882609.27718: sending task result for task 12673a56-9f93-95e7-9dfb-000000000099 29946 1726882609.27781: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000099 29946 1726882609.27783: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 29946 1726882609.27863: no more pending results, returning what we have 29946 1726882609.27866: results queue empty 29946 1726882609.27867: checking for any_errors_fatal 29946 1726882609.27872: done checking for any_errors_fatal 29946 1726882609.27873: checking for max_fail_percentage 29946 1726882609.27874: done checking for max_fail_percentage 29946 1726882609.27875: checking to see if all hosts have failed and the running result is not ok 29946 1726882609.27876: done checking to see if all hosts have failed 29946 1726882609.27876: getting the remaining hosts for this loop 29946 1726882609.27877: done getting the remaining hosts for this loop 29946 1726882609.27880: getting the next task for host managed_node2 29946 1726882609.27885: done getting next task for host managed_node2 29946 1726882609.27888: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29946 1726882609.27890: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882609.27904: getting variables 29946 1726882609.27905: in VariableManager get_vars() 29946 1726882609.27938: Calling all_inventory to load vars for managed_node2 29946 1726882609.27940: Calling groups_inventory to load vars for managed_node2 29946 1726882609.27943: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882609.27957: Calling all_plugins_play to load vars for managed_node2 29946 1726882609.27959: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882609.27962: Calling groups_plugins_play to load vars for managed_node2 29946 1726882609.30175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882609.33182: done with get_vars() 29946 1726882609.33208: done getting variables 29946 1726882609.33269: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:36:49 -0400 (0:00:00.140) 0:00:35.442 ****** 29946 1726882609.33302: entering _queue_task() for managed_node2/package 29946 1726882609.33663: worker is 1 (out of 1 available) 29946 1726882609.33678: exiting _queue_task() for managed_node2/package 29946 1726882609.33691: done queuing things up, now waiting for results queue to drain 29946 1726882609.33695: waiting for pending results... 
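The skip reported above for the "Install packages" task comes from its guard, not network_packages is subset(ansible_facts.packages.keys()): every package the role wants is already present in the gathered package facts, so there is nothing to install. A minimal sketch of that pattern, assuming package facts have been gathered beforehand and that network_packages is a plain list of package names (the role's real task file may differ):

- name: Gather installed package facts
  ansible.builtin.package_facts:
    manager: auto

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"     # list of required packages (computed by the role)
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
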
29946 1726882609.33961: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 29946 1726882609.34080: in run() - task 12673a56-9f93-95e7-9dfb-00000000009a 29946 1726882609.34109: variable 'ansible_search_path' from source: unknown 29946 1726882609.34122: variable 'ansible_search_path' from source: unknown 29946 1726882609.34166: calling self._execute() 29946 1726882609.34284: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.34305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.34320: variable 'omit' from source: magic vars 29946 1726882609.34721: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.34741: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.34869: variable 'network_state' from source: role '' defaults 29946 1726882609.34887: Evaluated conditional (network_state != {}): False 29946 1726882609.34901: when evaluation is False, skipping this task 29946 1726882609.34909: _execute() done 29946 1726882609.34916: dumping result to json 29946 1726882609.34924: done dumping result, returning 29946 1726882609.34936: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-95e7-9dfb-00000000009a] 29946 1726882609.34946: sending task result for task 12673a56-9f93-95e7-9dfb-00000000009a skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882609.35110: no more pending results, returning what we have 29946 1726882609.35114: results queue empty 29946 1726882609.35115: checking for any_errors_fatal 29946 1726882609.35123: done checking for any_errors_fatal 29946 1726882609.35124: checking for max_fail_percentage 29946 1726882609.35126: done checking for max_fail_percentage 29946 1726882609.35127: checking to see if all hosts have failed and the running result is not ok 29946 1726882609.35128: done checking to see if all hosts have failed 29946 1726882609.35129: getting the remaining hosts for this loop 29946 1726882609.35130: done getting the remaining hosts for this loop 29946 1726882609.35135: getting the next task for host managed_node2 29946 1726882609.35141: done getting next task for host managed_node2 29946 1726882609.35146: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29946 1726882609.35149: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882609.35166: getting variables 29946 1726882609.35168: in VariableManager get_vars() 29946 1726882609.35215: Calling all_inventory to load vars for managed_node2 29946 1726882609.35218: Calling groups_inventory to load vars for managed_node2 29946 1726882609.35221: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882609.35234: Calling all_plugins_play to load vars for managed_node2 29946 1726882609.35237: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882609.35240: Calling groups_plugins_play to load vars for managed_node2 29946 1726882609.36106: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000009a 29946 1726882609.36110: WORKER PROCESS EXITING 29946 1726882609.36937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882609.38583: done with get_vars() 29946 1726882609.38610: done getting variables 29946 1726882609.38666: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:36:49 -0400 (0:00:00.053) 0:00:35.496 ****** 29946 1726882609.38700: entering _queue_task() for managed_node2/package 29946 1726882609.39006: worker is 1 (out of 1 available) 29946 1726882609.39018: exiting _queue_task() for managed_node2/package 29946 1726882609.39031: done queuing things up, now waiting for results queue to drain 29946 1726882609.39032: waiting for pending results... 
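Both the "Install NetworkManager and nmstate" task above and the "Install python3-libnmstate" task that runs next are gated on network_state != {}. Because network_state keeps its role default of an empty dict in this run, both are skipped. To exercise the nmstate code path, a caller would pass a non-empty network_state to the role; a hypothetical invocation (interface name and settings invented purely for illustration) might look like:

- hosts: managed_node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces:
            - name: eth1        # hypothetical device
              type: ethernet
              state: up
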
29946 1726882609.39308: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 29946 1726882609.39432: in run() - task 12673a56-9f93-95e7-9dfb-00000000009b 29946 1726882609.39457: variable 'ansible_search_path' from source: unknown 29946 1726882609.39466: variable 'ansible_search_path' from source: unknown 29946 1726882609.39507: calling self._execute() 29946 1726882609.39614: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.39628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.39644: variable 'omit' from source: magic vars 29946 1726882609.40040: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.40056: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.40174: variable 'network_state' from source: role '' defaults 29946 1726882609.40207: Evaluated conditional (network_state != {}): False 29946 1726882609.40301: when evaluation is False, skipping this task 29946 1726882609.40304: _execute() done 29946 1726882609.40307: dumping result to json 29946 1726882609.40309: done dumping result, returning 29946 1726882609.40311: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-95e7-9dfb-00000000009b] 29946 1726882609.40314: sending task result for task 12673a56-9f93-95e7-9dfb-00000000009b 29946 1726882609.40495: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000009b 29946 1726882609.40501: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882609.40550: no more pending results, returning what we have 29946 1726882609.40554: results queue empty 29946 1726882609.40555: checking for any_errors_fatal 29946 1726882609.40564: done checking for any_errors_fatal 29946 1726882609.40565: checking for max_fail_percentage 29946 1726882609.40566: done checking for max_fail_percentage 29946 1726882609.40567: checking to see if all hosts have failed and the running result is not ok 29946 1726882609.40568: done checking to see if all hosts have failed 29946 1726882609.40569: getting the remaining hosts for this loop 29946 1726882609.40570: done getting the remaining hosts for this loop 29946 1726882609.40574: getting the next task for host managed_node2 29946 1726882609.40580: done getting next task for host managed_node2 29946 1726882609.40583: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29946 1726882609.40586: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882609.40710: getting variables 29946 1726882609.40712: in VariableManager get_vars() 29946 1726882609.40751: Calling all_inventory to load vars for managed_node2 29946 1726882609.40754: Calling groups_inventory to load vars for managed_node2 29946 1726882609.40757: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882609.40769: Calling all_plugins_play to load vars for managed_node2 29946 1726882609.40772: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882609.40775: Calling groups_plugins_play to load vars for managed_node2 29946 1726882609.42401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882609.44005: done with get_vars() 29946 1726882609.44031: done getting variables 29946 1726882609.44086: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:36:49 -0400 (0:00:00.054) 0:00:35.550 ****** 29946 1726882609.44121: entering _queue_task() for managed_node2/service 29946 1726882609.44439: worker is 1 (out of 1 available) 29946 1726882609.44450: exiting _queue_task() for managed_node2/service 29946 1726882609.44462: done queuing things up, now waiting for results queue to drain 29946 1726882609.44463: waiting for pending results... 
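The "Restart NetworkManager due to wireless or team interfaces" task queued here is evaluated just below and skipped: the single profile under test defines neither a wireless nor a team connection, so __network_wireless_connections_defined or __network_team_connections_defined is False. Roughly, the guarded restart amounts to the following sketch (names taken from the log, not the role's literal task file):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
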
29946 1726882609.44742: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 29946 1726882609.44850: in run() - task 12673a56-9f93-95e7-9dfb-00000000009c 29946 1726882609.44869: variable 'ansible_search_path' from source: unknown 29946 1726882609.44877: variable 'ansible_search_path' from source: unknown 29946 1726882609.44924: calling self._execute() 29946 1726882609.45031: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.45098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.45101: variable 'omit' from source: magic vars 29946 1726882609.45440: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.45462: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.45597: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882609.45807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882609.48820: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882609.48897: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882609.48936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882609.48978: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882609.49087: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882609.49121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.49167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.49206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.49251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.49268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.49326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.49354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.49415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 29946 1726882609.49431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.49448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.49491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.49642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.49646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.49648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.49650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.49824: variable 'network_connections' from source: play vars 29946 1726882609.49840: variable 'profile' from source: play vars 29946 1726882609.49927: variable 'profile' from source: play vars 29946 1726882609.49937: variable 'interface' from source: set_fact 29946 1726882609.50008: variable 'interface' from source: set_fact 29946 1726882609.50087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882609.50259: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882609.50308: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882609.50338: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882609.50365: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882609.50412: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882609.50436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882609.50459: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.50484: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882609.50576: variable '__network_team_connections_defined' from source: role '' defaults 29946 
1726882609.50941: variable 'network_connections' from source: play vars 29946 1726882609.50946: variable 'profile' from source: play vars 29946 1726882609.51209: variable 'profile' from source: play vars 29946 1726882609.51213: variable 'interface' from source: set_fact 29946 1726882609.51215: variable 'interface' from source: set_fact 29946 1726882609.51218: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 29946 1726882609.51319: when evaluation is False, skipping this task 29946 1726882609.51328: _execute() done 29946 1726882609.51337: dumping result to json 29946 1726882609.51344: done dumping result, returning 29946 1726882609.51356: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-95e7-9dfb-00000000009c] 29946 1726882609.51375: sending task result for task 12673a56-9f93-95e7-9dfb-00000000009c 29946 1726882609.51660: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000009c 29946 1726882609.51663: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 29946 1726882609.51711: no more pending results, returning what we have 29946 1726882609.51714: results queue empty 29946 1726882609.51715: checking for any_errors_fatal 29946 1726882609.51722: done checking for any_errors_fatal 29946 1726882609.51723: checking for max_fail_percentage 29946 1726882609.51725: done checking for max_fail_percentage 29946 1726882609.51725: checking to see if all hosts have failed and the running result is not ok 29946 1726882609.51726: done checking to see if all hosts have failed 29946 1726882609.51727: getting the remaining hosts for this loop 29946 1726882609.51728: done getting the remaining hosts for this loop 29946 1726882609.51731: getting the next task for host managed_node2 29946 1726882609.51737: done getting next task for host managed_node2 29946 1726882609.51740: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29946 1726882609.51742: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882609.51754: getting variables 29946 1726882609.51755: in VariableManager get_vars() 29946 1726882609.51794: Calling all_inventory to load vars for managed_node2 29946 1726882609.51797: Calling groups_inventory to load vars for managed_node2 29946 1726882609.51799: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882609.51810: Calling all_plugins_play to load vars for managed_node2 29946 1726882609.51812: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882609.51815: Calling groups_plugins_play to load vars for managed_node2 29946 1726882609.53965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882609.55621: done with get_vars() 29946 1726882609.55644: done getting variables 29946 1726882609.55711: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:36:49 -0400 (0:00:00.116) 0:00:35.666 ****** 29946 1726882609.55740: entering _queue_task() for managed_node2/service 29946 1726882609.56068: worker is 1 (out of 1 available) 29946 1726882609.56078: exiting _queue_task() for managed_node2/service 29946 1726882609.56196: done queuing things up, now waiting for results queue to drain 29946 1726882609.56198: waiting for pending results... 29946 1726882609.56510: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 29946 1726882609.56514: in run() - task 12673a56-9f93-95e7-9dfb-00000000009d 29946 1726882609.56517: variable 'ansible_search_path' from source: unknown 29946 1726882609.56520: variable 'ansible_search_path' from source: unknown 29946 1726882609.56548: calling self._execute() 29946 1726882609.56653: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.56664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.56678: variable 'omit' from source: magic vars 29946 1726882609.57054: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.57076: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882609.57246: variable 'network_provider' from source: set_fact 29946 1726882609.57257: variable 'network_state' from source: role '' defaults 29946 1726882609.57270: Evaluated conditional (network_provider == "nm" or network_state != {}): True 29946 1726882609.57279: variable 'omit' from source: magic vars 29946 1726882609.57327: variable 'omit' from source: magic vars 29946 1726882609.57359: variable 'network_service_name' from source: role '' defaults 29946 1726882609.57439: variable 'network_service_name' from source: role '' defaults 29946 1726882609.57552: variable '__network_provider_setup' from source: role '' defaults 29946 1726882609.57624: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882609.57645: variable '__network_service_name_default_nm' from source: role '' defaults 29946 1726882609.57659: variable '__network_packages_default_nm' from source: role '' defaults 
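Unlike the previous tasks, the guard on "Enable and start NetworkManager" (network_provider == "nm" or network_state != {}) evaluates to True, so this task actually executes: further down, the systemd module is bundled into AnsiballZ_systemd.py, copied over the multiplexed SSH connection into a fresh ~/.ansible/tmp directory on the target, and run with the remote /usr/bin/python3.12. Functionally the task is equivalent to something like this sketch (the role resolves network_service_name from its defaults; the real task may differ in detail):

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to NetworkManager for the nm provider
    state: started
    enabled: true
  when: 'network_provider == "nm" or network_state != {}'
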
29946 1726882609.57732: variable '__network_packages_default_nm' from source: role '' defaults 29946 1726882609.57970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882609.61339: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882609.61420: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882609.61456: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882609.61509: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882609.61538: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882609.61618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.61648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.61674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.61721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.61736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.61825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.61828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.61830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.61865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.61880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.62105: variable '__network_packages_default_gobject_packages' from source: role '' defaults 29946 1726882609.62215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.62239: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.62266: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.62308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.62323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.62468: variable 'ansible_python' from source: facts 29946 1726882609.62474: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 29946 1726882609.62506: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882609.62560: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882609.62646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.62663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.62679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.62709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.62720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.62752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882609.62772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882609.62788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.62818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882609.62828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882609.62919: variable 'network_connections' from 
source: play vars 29946 1726882609.62926: variable 'profile' from source: play vars 29946 1726882609.62976: variable 'profile' from source: play vars 29946 1726882609.62981: variable 'interface' from source: set_fact 29946 1726882609.63041: variable 'interface' from source: set_fact 29946 1726882609.63285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882609.63315: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882609.63498: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882609.63502: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882609.63504: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882609.63507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882609.63531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882609.63561: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882609.63592: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882609.63641: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882609.64186: variable 'network_connections' from source: play vars 29946 1726882609.64197: variable 'profile' from source: play vars 29946 1726882609.64278: variable 'profile' from source: play vars 29946 1726882609.64281: variable 'interface' from source: set_fact 29946 1726882609.64344: variable 'interface' from source: set_fact 29946 1726882609.64375: variable '__network_packages_default_wireless' from source: role '' defaults 29946 1726882609.64456: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882609.64751: variable 'network_connections' from source: play vars 29946 1726882609.64756: variable 'profile' from source: play vars 29946 1726882609.64831: variable 'profile' from source: play vars 29946 1726882609.64927: variable 'interface' from source: set_fact 29946 1726882609.64931: variable 'interface' from source: set_fact 29946 1726882609.64935: variable '__network_packages_default_team' from source: role '' defaults 29946 1726882609.65018: variable '__network_team_connections_defined' from source: role '' defaults 29946 1726882609.65314: variable 'network_connections' from source: play vars 29946 1726882609.65318: variable 'profile' from source: play vars 29946 1726882609.65386: variable 'profile' from source: play vars 29946 1726882609.65389: variable 'interface' from source: set_fact 29946 1726882609.65469: variable 'interface' from source: set_fact 29946 1726882609.65527: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882609.65588: variable '__network_service_name_default_initscripts' from source: role '' defaults 29946 1726882609.65591: 
variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882609.65657: variable '__network_packages_default_initscripts' from source: role '' defaults 29946 1726882609.65878: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 29946 1726882609.67016: variable 'network_connections' from source: play vars 29946 1726882609.67019: variable 'profile' from source: play vars 29946 1726882609.67072: variable 'profile' from source: play vars 29946 1726882609.67081: variable 'interface' from source: set_fact 29946 1726882609.67153: variable 'interface' from source: set_fact 29946 1726882609.67274: variable 'ansible_distribution' from source: facts 29946 1726882609.67278: variable '__network_rh_distros' from source: role '' defaults 29946 1726882609.67280: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.67282: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 29946 1726882609.67389: variable 'ansible_distribution' from source: facts 29946 1726882609.67401: variable '__network_rh_distros' from source: role '' defaults 29946 1726882609.67412: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.67429: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 29946 1726882609.67608: variable 'ansible_distribution' from source: facts 29946 1726882609.67620: variable '__network_rh_distros' from source: role '' defaults 29946 1726882609.67630: variable 'ansible_distribution_major_version' from source: facts 29946 1726882609.67670: variable 'network_provider' from source: set_fact 29946 1726882609.67701: variable 'omit' from source: magic vars 29946 1726882609.67739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882609.67768: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882609.67791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882609.67823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882609.67842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882609.67875: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882609.67886: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.67897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.67999: Set connection var ansible_pipelining to False 29946 1726882609.68039: Set connection var ansible_shell_executable to /bin/sh 29946 1726882609.68041: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882609.68044: Set connection var ansible_timeout to 10 29946 1726882609.68046: Set connection var ansible_shell_type to sh 29946 1726882609.68047: Set connection var ansible_connection to ssh 29946 1726882609.68069: variable 'ansible_shell_executable' from source: unknown 29946 1726882609.68077: variable 'ansible_connection' from source: unknown 29946 1726882609.68149: variable 'ansible_module_compression' from source: unknown 29946 1726882609.68152: variable 'ansible_shell_type' from source: unknown 29946 1726882609.68155: variable 'ansible_shell_executable' from 
source: unknown 29946 1726882609.68157: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882609.68163: variable 'ansible_pipelining' from source: unknown 29946 1726882609.68165: variable 'ansible_timeout' from source: unknown 29946 1726882609.68167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882609.68240: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882609.68265: variable 'omit' from source: magic vars 29946 1726882609.68277: starting attempt loop 29946 1726882609.68288: running the handler 29946 1726882609.68411: variable 'ansible_facts' from source: unknown 29946 1726882609.69291: _low_level_execute_command(): starting 29946 1726882609.69307: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882609.70120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882609.70142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882609.70159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882609.70225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882609.70304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882609.72103: stdout chunk (state=3): >>>/root <<< 29946 1726882609.72119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882609.72175: stderr chunk (state=3): >>><<< 29946 1726882609.72192: stdout chunk (state=3): >>><<< 29946 1726882609.72220: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882609.72238: _low_level_execute_command(): starting 29946 1726882609.72248: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562 `" && echo ansible-tmp-1726882609.7222664-31610-95725974731562="` echo /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562 `" ) && sleep 0' 29946 1726882609.72853: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882609.72856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882609.72859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882609.72861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882609.72937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882609.72940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882609.73024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882609.75198: stdout chunk (state=3): >>>ansible-tmp-1726882609.7222664-31610-95725974731562=/root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562 <<< 29946 1726882609.75201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882609.75204: stderr chunk (state=3): >>><<< 29946 1726882609.75206: stdout chunk (state=3): >>><<< 29946 1726882609.75208: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882609.7222664-31610-95725974731562=/root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882609.75211: variable 'ansible_module_compression' from source: unknown 29946 1726882609.75260: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 29946 1726882609.75412: variable 'ansible_facts' from source: unknown 29946 1726882609.75664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py 29946 1726882609.75847: Sending initial data 29946 1726882609.75854: Sent initial data (155 bytes) 29946 1726882609.76418: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882609.76428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882609.76438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882609.76452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882609.76464: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882609.76506: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882609.76570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882609.76594: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882609.76678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882609.78414: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882609.78470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882609.78532: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpysywtmkp /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py <<< 29946 1726882609.78545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py" <<< 29946 1726882609.78599: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpysywtmkp" to remote "/root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py" <<< 29946 1726882609.80475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882609.80537: stderr chunk (state=3): >>><<< 29946 1726882609.80644: stdout chunk (state=3): >>><<< 29946 1726882609.80647: done transferring module to remote 29946 1726882609.80649: _low_level_execute_command(): starting 29946 1726882609.80652: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/ /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py && sleep 0' 29946 1726882609.81185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882609.81211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882609.81250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882609.81266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882609.81358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882609.81384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882609.81404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882609.81523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882609.83411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882609.83415: stdout chunk (state=3): >>><<< 29946 1726882609.83417: stderr chunk (state=3): >>><<< 29946 1726882609.83420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882609.83422: _low_level_execute_command(): starting 29946 1726882609.83424: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/AnsiballZ_systemd.py && sleep 0' 29946 1726882609.84509: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882609.84610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882609.84707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882609.84727: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882609.84879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882610.13518: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": 
"restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299987456", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1570239000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": 
"dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 29946 1726882610.15110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882610.15127: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882610.15186: stderr chunk (state=3): >>><<< 29946 1726882610.15226: stdout chunk (state=3): >>><<< 29946 1726882610.15436: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "6947", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainStartTimestampMonotonic": "260736749", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ExecMainHandoffTimestampMonotonic": "260753620", "ExecMainPID": "6947", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "4605", "MemoryCurrent": "4657152", "MemoryPeak": "7507968", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3299987456", "EffectiveMemoryMax": "3702878208", "EffectiveMemoryHigh": "3702878208", "CPUUsageNSec": "1570239000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "cloud-init.service NetworkManager-wait-online.service network.target shutdown.target multi-user.target", "After": "basic.target cloud-init-local.service dbus-broker.service system.slice network-pre.target systemd-journald.socket sysinit.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:25 EDT", "StateChangeTimestampMonotonic": "355353338", "InactiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveExitTimestampMonotonic": "260738404", "ActiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveEnterTimestampMonotonic": "260824743", "ActiveExitTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ActiveExitTimestampMonotonic": "260719627", "InactiveEnterTimestamp": "Fri 2024-09-20 21:27:50 EDT", "InactiveEnterTimestampMonotonic": "260732561", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:27:50 EDT", "ConditionTimestampMonotonic": "260735742", "AssertTimestamp": "Fri 2024-09-20 21:27:50 EDT", "AssertTimestampMonotonic": "260735751", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "02f7cf7a90d5486687dc572c7e50e205", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882610.15607: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882610.16098: _low_level_execute_command(): starting 29946 1726882610.16102: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882609.7222664-31610-95725974731562/ > /dev/null 2>&1 && sleep 0' 29946 1726882610.17424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882610.17699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882610.17911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882610.18221: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882610.20120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882610.20123: stdout chunk (state=3): >>><<< 29946 1726882610.20125: stderr chunk (state=3): >>><<< 29946 1726882610.20298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882610.20301: handler run complete 29946 1726882610.20303: attempt loop complete, returning result 29946 1726882610.20305: _execute() done 29946 1726882610.20307: dumping result to json 29946 1726882610.20308: done dumping result, returning 29946 1726882610.20310: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-95e7-9dfb-00000000009d] 29946 1726882610.20312: sending task result for task 12673a56-9f93-95e7-9dfb-00000000009d ok: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882610.20922: no more pending results, returning what we have 29946 1726882610.20925: results queue empty 29946 1726882610.20926: checking for any_errors_fatal 29946 1726882610.20932: done checking for any_errors_fatal 29946 1726882610.20932: checking for max_fail_percentage 29946 1726882610.20934: done checking for max_fail_percentage 29946 1726882610.20935: checking to see if all hosts have failed and the running result is not ok 29946 1726882610.20936: done checking to see if all hosts have failed 29946 1726882610.20936: getting the remaining hosts for this loop 29946 1726882610.20938: done getting the remaining hosts for this loop 29946 1726882610.20941: getting the next task for host managed_node2 29946 1726882610.20947: done getting next task for host managed_node2 29946 1726882610.20950: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29946 1726882610.20952: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882610.20961: getting variables 29946 1726882610.20962: in VariableManager get_vars() 29946 1726882610.21034: Calling all_inventory to load vars for managed_node2 29946 1726882610.21037: Calling groups_inventory to load vars for managed_node2 29946 1726882610.21039: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882610.21051: Calling all_plugins_play to load vars for managed_node2 29946 1726882610.21053: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882610.21056: Calling groups_plugins_play to load vars for managed_node2 29946 1726882610.21699: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000009d 29946 1726882610.21703: WORKER PROCESS EXITING 29946 1726882610.24178: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882610.27404: done with get_vars() 29946 1726882610.27430: done getting variables 29946 1726882610.27695: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:36:50 -0400 (0:00:00.719) 0:00:36.386 ****** 29946 1726882610.27728: entering _queue_task() for managed_node2/service 29946 1726882610.28288: worker is 1 (out of 1 available) 29946 1726882610.28702: exiting _queue_task() for managed_node2/service 29946 1726882610.28712: done queuing things up, now waiting for results queue to drain 29946 1726882610.28713: waiting for pending results... 
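[Editorial note, not part of the captured log] The module arguments echoed above for the "Enable and start NetworkManager" task (name=NetworkManager, state=started, enabled=True, scope=system, with the result censored by no_log) correspond to a systemd service task of roughly the following shape. This is an illustrative reconstruction from the logged arguments, not the literal task source in the fedora.linux_system_roles.network role:

    # Illustrative sketch only, reconstructed from the module args in the log above.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager   # unit reported as ActiveState=active in the returned status
        state: started         # module returned changed=false, so the unit was already running
        enabled: true          # UnitFileState=enabled in the returned status
      no_log: true             # matches the "censored" result printed in the log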
29946 1726882610.28904: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 29946 1726882610.29082: in run() - task 12673a56-9f93-95e7-9dfb-00000000009e 29946 1726882610.29172: variable 'ansible_search_path' from source: unknown 29946 1726882610.29181: variable 'ansible_search_path' from source: unknown 29946 1726882610.29227: calling self._execute() 29946 1726882610.29498: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882610.29515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882610.29529: variable 'omit' from source: magic vars 29946 1726882610.30292: variable 'ansible_distribution_major_version' from source: facts 29946 1726882610.30394: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882610.30631: variable 'network_provider' from source: set_fact 29946 1726882610.30644: Evaluated conditional (network_provider == "nm"): True 29946 1726882610.30848: variable '__network_wpa_supplicant_required' from source: role '' defaults 29946 1726882610.31134: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 29946 1726882610.31450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882610.35849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882610.35922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882610.35970: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882610.36013: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882610.36047: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882610.36270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882610.36322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882610.36427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882610.36472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882610.36623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882610.36669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882610.36697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 29946 1726882610.36817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882610.36864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882610.36884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882610.37151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882610.37154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882610.37156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882610.37158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882610.37159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882610.37519: variable 'network_connections' from source: play vars 29946 1726882610.37536: variable 'profile' from source: play vars 29946 1726882610.37728: variable 'profile' from source: play vars 29946 1726882610.37737: variable 'interface' from source: set_fact 29946 1726882610.37998: variable 'interface' from source: set_fact 29946 1726882610.38001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 29946 1726882610.38360: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 29946 1726882610.38502: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 29946 1726882610.38537: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 29946 1726882610.38900: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 29946 1726882610.38903: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 29946 1726882610.38905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 29946 1726882610.38907: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882610.38931: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 29946 1726882610.38980: variable '__network_wireless_connections_defined' from source: role '' defaults 29946 1726882610.39559: variable 'network_connections' from source: play vars 29946 1726882610.39569: variable 'profile' from source: play vars 29946 1726882610.39643: variable 'profile' from source: play vars 29946 1726882610.39651: variable 'interface' from source: set_fact 29946 1726882610.39720: variable 'interface' from source: set_fact 29946 1726882610.39827: Evaluated conditional (__network_wpa_supplicant_required): False 29946 1726882610.39883: when evaluation is False, skipping this task 29946 1726882610.39894: _execute() done 29946 1726882610.39911: dumping result to json 29946 1726882610.39918: done dumping result, returning 29946 1726882610.39930: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-95e7-9dfb-00000000009e] 29946 1726882610.39939: sending task result for task 12673a56-9f93-95e7-9dfb-00000000009e skipping: [managed_node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 29946 1726882610.40137: no more pending results, returning what we have 29946 1726882610.40141: results queue empty 29946 1726882610.40142: checking for any_errors_fatal 29946 1726882610.40161: done checking for any_errors_fatal 29946 1726882610.40161: checking for max_fail_percentage 29946 1726882610.40163: done checking for max_fail_percentage 29946 1726882610.40164: checking to see if all hosts have failed and the running result is not ok 29946 1726882610.40165: done checking to see if all hosts have failed 29946 1726882610.40166: getting the remaining hosts for this loop 29946 1726882610.40167: done getting the remaining hosts for this loop 29946 1726882610.40171: getting the next task for host managed_node2 29946 1726882610.40177: done getting next task for host managed_node2 29946 1726882610.40181: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 29946 1726882610.40183: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882610.40201: getting variables 29946 1726882610.40203: in VariableManager get_vars() 29946 1726882610.40242: Calling all_inventory to load vars for managed_node2 29946 1726882610.40245: Calling groups_inventory to load vars for managed_node2 29946 1726882610.40247: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882610.40258: Calling all_plugins_play to load vars for managed_node2 29946 1726882610.40261: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882610.40264: Calling groups_plugins_play to load vars for managed_node2 29946 1726882610.41101: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000009e 29946 1726882610.41104: WORKER PROCESS EXITING 29946 1726882610.43287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882610.48786: done with get_vars() 29946 1726882610.49138: done getting variables 29946 1726882610.49203: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:36:50 -0400 (0:00:00.217) 0:00:36.604 ****** 29946 1726882610.49517: entering _queue_task() for managed_node2/service 29946 1726882610.50076: worker is 1 (out of 1 available) 29946 1726882610.50092: exiting _queue_task() for managed_node2/service 29946 1726882610.50106: done queuing things up, now waiting for results queue to drain 29946 1726882610.50107: waiting for pending results... 
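[Editorial note, not part of the captured log] The skip recorded above is driven by the __network_wpa_supplicant_required conditional (false_condition in the skip result), after network_provider == "nm" evaluated True. A task guarded this way would look roughly like the sketch below; it illustrates the when: mechanics seen in the log and is not the role's literal source:

    # Illustrative sketch of the conditionally-skipped service task.
    # Variable names come from the log; the task body is assumed.
    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when:
        - network_provider == "nm"            # evaluated True above
        - __network_wpa_supplicant_required   # evaluated False, so the task is skipped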
29946 1726882610.50799: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service 29946 1726882610.51248: in run() - task 12673a56-9f93-95e7-9dfb-00000000009f 29946 1726882610.51252: variable 'ansible_search_path' from source: unknown 29946 1726882610.51255: variable 'ansible_search_path' from source: unknown 29946 1726882610.51258: calling self._execute() 29946 1726882610.51332: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882610.51408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882610.51475: variable 'omit' from source: magic vars 29946 1726882610.52172: variable 'ansible_distribution_major_version' from source: facts 29946 1726882610.52241: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882610.52515: variable 'network_provider' from source: set_fact 29946 1726882610.52528: Evaluated conditional (network_provider == "initscripts"): False 29946 1726882610.52558: when evaluation is False, skipping this task 29946 1726882610.52566: _execute() done 29946 1726882610.52576: dumping result to json 29946 1726882610.52698: done dumping result, returning 29946 1726882610.52702: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-95e7-9dfb-00000000009f] 29946 1726882610.52704: sending task result for task 12673a56-9f93-95e7-9dfb-00000000009f 29946 1726882610.52888: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000009f 29946 1726882610.52892: WORKER PROCESS EXITING skipping: [managed_node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 29946 1726882610.52940: no more pending results, returning what we have 29946 1726882610.52944: results queue empty 29946 1726882610.52945: checking for any_errors_fatal 29946 1726882610.52954: done checking for any_errors_fatal 29946 1726882610.52955: checking for max_fail_percentage 29946 1726882610.52957: done checking for max_fail_percentage 29946 1726882610.52958: checking to see if all hosts have failed and the running result is not ok 29946 1726882610.52959: done checking to see if all hosts have failed 29946 1726882610.52959: getting the remaining hosts for this loop 29946 1726882610.52961: done getting the remaining hosts for this loop 29946 1726882610.52964: getting the next task for host managed_node2 29946 1726882610.52971: done getting next task for host managed_node2 29946 1726882610.52974: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29946 1726882610.52978: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882610.52996: getting variables 29946 1726882610.52998: in VariableManager get_vars() 29946 1726882610.53035: Calling all_inventory to load vars for managed_node2 29946 1726882610.53037: Calling groups_inventory to load vars for managed_node2 29946 1726882610.53039: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882610.53050: Calling all_plugins_play to load vars for managed_node2 29946 1726882610.53052: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882610.53054: Calling groups_plugins_play to load vars for managed_node2 29946 1726882610.55949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882610.60010: done with get_vars() 29946 1726882610.60039: done getting variables 29946 1726882610.60219: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:36:50 -0400 (0:00:00.107) 0:00:36.712 ****** 29946 1726882610.60252: entering _queue_task() for managed_node2/copy 29946 1726882610.60943: worker is 1 (out of 1 available) 29946 1726882610.61073: exiting _queue_task() for managed_node2/copy 29946 1726882610.61084: done queuing things up, now waiting for results queue to drain 29946 1726882610.61086: waiting for pending results... 29946 1726882610.61608: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 29946 1726882610.61614: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a0 29946 1726882610.61620: variable 'ansible_search_path' from source: unknown 29946 1726882610.61623: variable 'ansible_search_path' from source: unknown 29946 1726882610.61999: calling self._execute() 29946 1726882610.62002: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882610.62005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882610.62007: variable 'omit' from source: magic vars 29946 1726882610.62942: variable 'ansible_distribution_major_version' from source: facts 29946 1726882610.62960: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882610.63332: variable 'network_provider' from source: set_fact 29946 1726882610.63343: Evaluated conditional (network_provider == "initscripts"): False 29946 1726882610.63351: when evaluation is False, skipping this task 29946 1726882610.63359: _execute() done 29946 1726882610.63366: dumping result to json 29946 1726882610.63373: done dumping result, returning 29946 1726882610.63384: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-95e7-9dfb-0000000000a0] 29946 1726882610.63402: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a0 skipping: [managed_node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 29946 1726882610.63551: no more pending results, returning what we have 29946 
1726882610.63555: results queue empty 29946 1726882610.63556: checking for any_errors_fatal 29946 1726882610.63564: done checking for any_errors_fatal 29946 1726882610.63565: checking for max_fail_percentage 29946 1726882610.63568: done checking for max_fail_percentage 29946 1726882610.63569: checking to see if all hosts have failed and the running result is not ok 29946 1726882610.63570: done checking to see if all hosts have failed 29946 1726882610.63571: getting the remaining hosts for this loop 29946 1726882610.63573: done getting the remaining hosts for this loop 29946 1726882610.63577: getting the next task for host managed_node2 29946 1726882610.63584: done getting next task for host managed_node2 29946 1726882610.63588: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29946 1726882610.63590: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882610.63809: getting variables 29946 1726882610.63811: in VariableManager get_vars() 29946 1726882610.63848: Calling all_inventory to load vars for managed_node2 29946 1726882610.63850: Calling groups_inventory to load vars for managed_node2 29946 1726882610.63852: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882610.63865: Calling all_plugins_play to load vars for managed_node2 29946 1726882610.63868: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882610.63871: Calling groups_plugins_play to load vars for managed_node2 29946 1726882610.64506: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a0 29946 1726882610.64510: WORKER PROCESS EXITING 29946 1726882610.67446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882610.69560: done with get_vars() 29946 1726882610.69585: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:36:50 -0400 (0:00:00.094) 0:00:36.806 ****** 29946 1726882610.69704: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 29946 1726882610.70469: worker is 1 (out of 1 available) 29946 1726882610.70480: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_connections 29946 1726882610.70492: done queuing things up, now waiting for results queue to drain 29946 1726882610.70797: waiting for pending results... 
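[Editorial note, not part of the captured log] The "Configure networking connection profiles" task queued above consumes the network_connections, profile, and interface variables that the log keeps resolving from play vars and set_fact. The actual values are not visible in this run, so the snippet below is only a minimal, hypothetical shape for those play vars; the interface name and connection settings are placeholders:

    # Hypothetical play vars; the real profile/interface values in this run
    # come from set_fact and are not shown in the log.
    vars:
      interface: "testnic0"            # placeholder name
      profile: "{{ interface }}"
      network_connections:
        - name: "{{ profile }}"
          state: up
          type: ethernet
          interface_name: "{{ interface }}"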
29946 1726882610.70916: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 29946 1726882610.71256: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a1 29946 1726882610.71317: variable 'ansible_search_path' from source: unknown 29946 1726882610.71321: variable 'ansible_search_path' from source: unknown 29946 1726882610.71413: calling self._execute() 29946 1726882610.71697: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882610.71701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882610.71705: variable 'omit' from source: magic vars 29946 1726882610.72468: variable 'ansible_distribution_major_version' from source: facts 29946 1726882610.72481: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882610.72487: variable 'omit' from source: magic vars 29946 1726882610.72605: variable 'omit' from source: magic vars 29946 1726882610.72855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 29946 1726882610.77566: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 29946 1726882610.77571: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 29946 1726882610.77604: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 29946 1726882610.77649: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 29946 1726882610.77816: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 29946 1726882610.78100: variable 'network_provider' from source: set_fact 29946 1726882610.78155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 29946 1726882610.78350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 29946 1726882610.78382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 29946 1726882610.78511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 29946 1726882610.78537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 29946 1726882610.78619: variable 'omit' from source: magic vars 29946 1726882610.78986: variable 'omit' from source: magic vars 29946 1726882610.79209: variable 'network_connections' from source: play vars 29946 1726882610.79227: variable 'profile' from source: play vars 29946 1726882610.79598: variable 'profile' from source: play vars 29946 1726882610.79602: variable 'interface' from source: set_fact 29946 1726882610.79604: variable 'interface' from source: set_fact 29946 1726882610.79847: variable 'omit' from source: magic vars 29946 1726882610.79859: 
variable '__lsr_ansible_managed' from source: task vars 29946 1726882610.79923: variable '__lsr_ansible_managed' from source: task vars 29946 1726882610.80436: Loaded config def from plugin (lookup/template) 29946 1726882610.80701: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 29946 1726882610.80704: File lookup term: get_ansible_managed.j2 29946 1726882610.80707: variable 'ansible_search_path' from source: unknown 29946 1726882610.80709: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 29946 1726882610.80714: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 29946 1726882610.80718: variable 'ansible_search_path' from source: unknown 29946 1726882611.12268: variable 'ansible_managed' from source: unknown 29946 1726882611.12519: variable 'omit' from source: magic vars 29946 1726882611.12746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882611.12776: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882611.12802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882611.12843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882611.12909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882611.12938: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882611.12983: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.13006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.13222: Set connection var ansible_pipelining to False 29946 1726882611.13234: Set connection var ansible_shell_executable to /bin/sh 29946 1726882611.13245: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882611.13364: Set connection var ansible_timeout to 10 29946 1726882611.13367: Set connection var ansible_shell_type to sh 29946 1726882611.13369: Set connection var ansible_connection to ssh 29946 1726882611.13371: variable 'ansible_shell_executable' from source: unknown 29946 1726882611.13373: variable 'ansible_connection' from source: unknown 29946 1726882611.13374: 
variable 'ansible_module_compression' from source: unknown 29946 1726882611.13376: variable 'ansible_shell_type' from source: unknown 29946 1726882611.13378: variable 'ansible_shell_executable' from source: unknown 29946 1726882611.13379: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.13381: variable 'ansible_pipelining' from source: unknown 29946 1726882611.13383: variable 'ansible_timeout' from source: unknown 29946 1726882611.13519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.13811: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882611.13822: variable 'omit' from source: magic vars 29946 1726882611.13824: starting attempt loop 29946 1726882611.13827: running the handler 29946 1726882611.13829: _low_level_execute_command(): starting 29946 1726882611.13830: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882611.15266: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882611.15412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.15496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882611.15577: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882611.15620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882611.15763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882611.17473: stdout chunk (state=3): >>>/root <<< 29946 1726882611.17606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882611.17642: stderr chunk (state=3): >>><<< 29946 1726882611.17657: stdout chunk (state=3): >>><<< 29946 1726882611.17687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882611.17709: _low_level_execute_command(): starting 29946 1726882611.17720: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644 `" && echo ansible-tmp-1726882611.1769798-31671-267767361326644="` echo /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644 `" ) && sleep 0' 29946 1726882611.18947: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.19005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882611.19068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882611.19107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882611.19224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882611.21086: stdout chunk (state=3): >>>ansible-tmp-1726882611.1769798-31671-267767361326644=/root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644 <<< 29946 1726882611.21222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882611.21238: stdout chunk (state=3): >>><<< 29946 1726882611.21249: stderr chunk (state=3): >>><<< 29946 1726882611.21270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882611.1769798-31671-267767361326644=/root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882611.21326: variable 'ansible_module_compression' from source: unknown 29946 1726882611.21453: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 29946 1726882611.21456: variable 'ansible_facts' from source: unknown 29946 1726882611.22004: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py 29946 1726882611.22127: Sending initial data 29946 1726882611.22137: Sent initial data (168 bytes) 29946 1726882611.22916: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882611.22932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882611.23000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.23062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882611.23079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882611.23327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882611.23428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882611.24957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882611.25020: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882611.25092: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpp67jeowe /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py <<< 29946 1726882611.25104: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py" <<< 29946 1726882611.25165: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpp67jeowe" to remote "/root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py" <<< 29946 1726882611.26974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882611.27356: stderr chunk (state=3): >>><<< 29946 1726882611.27359: stdout chunk (state=3): >>><<< 29946 1726882611.27362: done transferring module to remote 29946 1726882611.27364: _low_level_execute_command(): starting 29946 1726882611.27366: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/ /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py && sleep 0' 29946 1726882611.28424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882611.28428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882611.28434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882611.28455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882611.28507: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.28615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882611.28628: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882611.28638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882611.28728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882611.30645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882611.30649: stdout chunk (state=3): >>><<< 29946 1726882611.30655: stderr chunk (state=3): >>><<< 29946 1726882611.30670: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882611.30673: _low_level_execute_command(): starting 29946 1726882611.30678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/AnsiballZ_network_connections.py && sleep 0' 29946 1726882611.31220: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882611.31225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882611.31271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882611.31277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882611.31279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.31281: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882611.31283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882611.31285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.31317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882611.31321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882611.31421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882611.58359: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3egn5x15/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3egn5x15/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/40a117ec-4b75-4c1f-bad4-81df3058e541: error=unknown <<< 29946 1726882611.58610: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 29946 1726882611.60277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882611.60292: stdout chunk (state=3): >>><<< 29946 1726882611.60308: stderr chunk (state=3): >>><<< 29946 1726882611.60404: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3egn5x15/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_3egn5x15/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/40a117ec-4b75-4c1f-bad4-81df3058e541: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882611.60535: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882611.60538: _low_level_execute_command(): starting 29946 1726882611.60541: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882611.1769798-31671-267767361326644/ > /dev/null 2>&1 && sleep 0' 29946 1726882611.61208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882611.61212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882611.61229: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882611.61235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.61247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882611.61253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882611.61299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882611.61354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882611.61360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882611.61400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882611.61458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882611.63386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882611.63389: stdout chunk (state=3): >>><<< 29946 1726882611.63391: 
stderr chunk (state=3): >>><<< 29946 1726882611.63618: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882611.63621: handler run complete 29946 1726882611.63624: attempt loop complete, returning result 29946 1726882611.63626: _execute() done 29946 1726882611.63628: dumping result to json 29946 1726882611.63635: done dumping result, returning 29946 1726882611.63638: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-95e7-9dfb-0000000000a1] 29946 1726882611.63640: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a1 changed: [managed_node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 29946 1726882611.64052: no more pending results, returning what we have 29946 1726882611.64058: results queue empty 29946 1726882611.64059: checking for any_errors_fatal 29946 1726882611.64065: done checking for any_errors_fatal 29946 1726882611.64066: checking for max_fail_percentage 29946 1726882611.64067: done checking for max_fail_percentage 29946 1726882611.64068: checking to see if all hosts have failed and the running result is not ok 29946 1726882611.64069: done checking to see if all hosts have failed 29946 1726882611.64070: getting the remaining hosts for this loop 29946 1726882611.64071: done getting the remaining hosts for this loop 29946 1726882611.64075: getting the next task for host managed_node2 29946 1726882611.64081: done getting next task for host managed_node2 29946 1726882611.64085: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 29946 1726882611.64087: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882611.64403: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a1 29946 1726882611.64406: WORKER PROCESS EXITING 29946 1726882611.64421: getting variables 29946 1726882611.64423: in VariableManager get_vars() 29946 1726882611.64516: Calling all_inventory to load vars for managed_node2 29946 1726882611.64519: Calling groups_inventory to load vars for managed_node2 29946 1726882611.64522: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882611.64602: Calling all_plugins_play to load vars for managed_node2 29946 1726882611.64645: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882611.64650: Calling groups_plugins_play to load vars for managed_node2 29946 1726882611.66861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882611.68583: done with get_vars() 29946 1726882611.68608: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:36:51 -0400 (0:00:00.989) 0:00:37.796 ****** 29946 1726882611.68698: entering _queue_task() for managed_node2/fedora.linux_system_roles.network_state 29946 1726882611.69403: worker is 1 (out of 1 available) 29946 1726882611.69410: exiting _queue_task() for managed_node2/fedora.linux_system_roles.network_state 29946 1726882611.69419: done queuing things up, now waiting for results queue to drain 29946 1726882611.69421: waiting for pending results... 29946 1726882611.69473: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state 29946 1726882611.69589: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a2 29946 1726882611.69614: variable 'ansible_search_path' from source: unknown 29946 1726882611.69622: variable 'ansible_search_path' from source: unknown 29946 1726882611.69759: calling self._execute() 29946 1726882611.69780: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.69796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.69811: variable 'omit' from source: magic vars 29946 1726882611.70796: variable 'ansible_distribution_major_version' from source: facts 29946 1726882611.70840: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882611.71163: variable 'network_state' from source: role '' defaults 29946 1726882611.71285: Evaluated conditional (network_state != {}): False 29946 1726882611.71289: when evaluation is False, skipping this task 29946 1726882611.71291: _execute() done 29946 1726882611.71297: dumping result to json 29946 1726882611.71299: done dumping result, returning 29946 1726882611.71302: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-95e7-9dfb-0000000000a2] 29946 1726882611.71304: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a2 29946 1726882611.71699: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a2 29946 1726882611.71702: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 29946 1726882611.71756: no more pending results, returning what we have 29946 1726882611.71761: results queue empty 29946 1726882611.71762: checking for 
any_errors_fatal 29946 1726882611.71774: done checking for any_errors_fatal 29946 1726882611.71774: checking for max_fail_percentage 29946 1726882611.71776: done checking for max_fail_percentage 29946 1726882611.71777: checking to see if all hosts have failed and the running result is not ok 29946 1726882611.71778: done checking to see if all hosts have failed 29946 1726882611.71779: getting the remaining hosts for this loop 29946 1726882611.71780: done getting the remaining hosts for this loop 29946 1726882611.71785: getting the next task for host managed_node2 29946 1726882611.71790: done getting next task for host managed_node2 29946 1726882611.71796: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29946 1726882611.71799: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882611.71816: getting variables 29946 1726882611.71818: in VariableManager get_vars() 29946 1726882611.71854: Calling all_inventory to load vars for managed_node2 29946 1726882611.71857: Calling groups_inventory to load vars for managed_node2 29946 1726882611.71859: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882611.71872: Calling all_plugins_play to load vars for managed_node2 29946 1726882611.71875: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882611.71878: Calling groups_plugins_play to load vars for managed_node2 29946 1726882611.79817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882611.81517: done with get_vars() 29946 1726882611.81542: done getting variables 29946 1726882611.81588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:36:51 -0400 (0:00:00.129) 0:00:37.925 ****** 29946 1726882611.81624: entering _queue_task() for managed_node2/debug 29946 1726882611.82225: worker is 1 (out of 1 available) 29946 1726882611.82233: exiting _queue_task() for managed_node2/debug 29946 1726882611.82243: done queuing things up, now waiting for results queue to drain 29946 1726882611.82244: waiting for pending results... 
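The task being queued here prints the stderr captured from the earlier network_connections run. Judging by its output below (ok: [managed_node2] => { "__network_connections_result.stderr_lines": ... }), it is most likely a plain debug of that variable; a minimal equivalent sketch, with only the variable and task names taken from the log:

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines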
29946 1726882611.82374: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 29946 1726882611.82421: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a3 29946 1726882611.82444: variable 'ansible_search_path' from source: unknown 29946 1726882611.82453: variable 'ansible_search_path' from source: unknown 29946 1726882611.82507: calling self._execute() 29946 1726882611.82611: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.82622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.82634: variable 'omit' from source: magic vars 29946 1726882611.83059: variable 'ansible_distribution_major_version' from source: facts 29946 1726882611.83130: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882611.83136: variable 'omit' from source: magic vars 29946 1726882611.83164: variable 'omit' from source: magic vars 29946 1726882611.83206: variable 'omit' from source: magic vars 29946 1726882611.83349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882611.83354: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882611.83357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882611.83359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882611.83391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882611.83444: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882611.83462: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.83474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.83595: Set connection var ansible_pipelining to False 29946 1726882611.83609: Set connection var ansible_shell_executable to /bin/sh 29946 1726882611.83676: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882611.83679: Set connection var ansible_timeout to 10 29946 1726882611.83682: Set connection var ansible_shell_type to sh 29946 1726882611.83686: Set connection var ansible_connection to ssh 29946 1726882611.83688: variable 'ansible_shell_executable' from source: unknown 29946 1726882611.83690: variable 'ansible_connection' from source: unknown 29946 1726882611.83694: variable 'ansible_module_compression' from source: unknown 29946 1726882611.83696: variable 'ansible_shell_type' from source: unknown 29946 1726882611.83786: variable 'ansible_shell_executable' from source: unknown 29946 1726882611.83789: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.83792: variable 'ansible_pipelining' from source: unknown 29946 1726882611.83795: variable 'ansible_timeout' from source: unknown 29946 1726882611.83798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.83898: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882611.83921: variable 'omit' from source: magic vars 29946 1726882611.83939: starting attempt loop 29946 1726882611.83974: running the handler 29946 1726882611.84194: variable '__network_connections_result' from source: set_fact 29946 1726882611.84260: handler run complete 29946 1726882611.84285: attempt loop complete, returning result 29946 1726882611.84330: _execute() done 29946 1726882611.84332: dumping result to json 29946 1726882611.84334: done dumping result, returning 29946 1726882611.84336: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-95e7-9dfb-0000000000a3] 29946 1726882611.84338: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a3 ok: [managed_node2] => { "__network_connections_result.stderr_lines": [ "" ] } 29946 1726882611.84490: no more pending results, returning what we have 29946 1726882611.84495: results queue empty 29946 1726882611.84496: checking for any_errors_fatal 29946 1726882611.84506: done checking for any_errors_fatal 29946 1726882611.84506: checking for max_fail_percentage 29946 1726882611.84508: done checking for max_fail_percentage 29946 1726882611.84509: checking to see if all hosts have failed and the running result is not ok 29946 1726882611.84510: done checking to see if all hosts have failed 29946 1726882611.84511: getting the remaining hosts for this loop 29946 1726882611.84512: done getting the remaining hosts for this loop 29946 1726882611.84516: getting the next task for host managed_node2 29946 1726882611.84521: done getting next task for host managed_node2 29946 1726882611.84524: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29946 1726882611.84527: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882611.84598: getting variables 29946 1726882611.84600: in VariableManager get_vars() 29946 1726882611.84634: Calling all_inventory to load vars for managed_node2 29946 1726882611.84636: Calling groups_inventory to load vars for managed_node2 29946 1726882611.84638: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882611.84767: Calling all_plugins_play to load vars for managed_node2 29946 1726882611.84772: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882611.84776: Calling groups_plugins_play to load vars for managed_node2 29946 1726882611.85382: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a3 29946 1726882611.85386: WORKER PROCESS EXITING 29946 1726882611.86309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882611.88007: done with get_vars() 29946 1726882611.88037: done getting variables 29946 1726882611.88095: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:36:51 -0400 (0:00:00.065) 0:00:37.990 ****** 29946 1726882611.88135: entering _queue_task() for managed_node2/debug 29946 1726882611.88722: worker is 1 (out of 1 available) 29946 1726882611.88730: exiting _queue_task() for managed_node2/debug 29946 1726882611.88739: done queuing things up, now waiting for results queue to drain 29946 1726882611.88741: waiting for pending results... 
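__network_connections_result, dumped in full by the next debug task, is an ordinary fact and can also be consumed by follow-up tasks outside the role. A sketch of such a consumer, assuming only the variable name and the fields visible in the logged result (failed, stderr, stderr_lines):

- name: Assert the profile removal produced no stderr
  ansible.builtin.assert:
    that:
      - __network_connections_result is not failed
      - __network_connections_result.stderr_lines | select() | list | length == 0
    fail_msg: "network_connections wrote to stderr: {{ __network_connections_result.stderr }}"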
29946 1726882611.88842: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 29946 1726882611.88984: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a4 29946 1726882611.89006: variable 'ansible_search_path' from source: unknown 29946 1726882611.89022: variable 'ansible_search_path' from source: unknown 29946 1726882611.89065: calling self._execute() 29946 1726882611.89213: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.89229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.89252: variable 'omit' from source: magic vars 29946 1726882611.89783: variable 'ansible_distribution_major_version' from source: facts 29946 1726882611.89805: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882611.89843: variable 'omit' from source: magic vars 29946 1726882611.90062: variable 'omit' from source: magic vars 29946 1726882611.90066: variable 'omit' from source: magic vars 29946 1726882611.90082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882611.90127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882611.90153: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882611.90187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882611.90205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882611.90235: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882611.90243: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.90249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.90353: Set connection var ansible_pipelining to False 29946 1726882611.90363: Set connection var ansible_shell_executable to /bin/sh 29946 1726882611.90371: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882611.90387: Set connection var ansible_timeout to 10 29946 1726882611.90402: Set connection var ansible_shell_type to sh 29946 1726882611.90499: Set connection var ansible_connection to ssh 29946 1726882611.90503: variable 'ansible_shell_executable' from source: unknown 29946 1726882611.90505: variable 'ansible_connection' from source: unknown 29946 1726882611.90508: variable 'ansible_module_compression' from source: unknown 29946 1726882611.90510: variable 'ansible_shell_type' from source: unknown 29946 1726882611.90512: variable 'ansible_shell_executable' from source: unknown 29946 1726882611.90514: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.90516: variable 'ansible_pipelining' from source: unknown 29946 1726882611.90518: variable 'ansible_timeout' from source: unknown 29946 1726882611.90520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.90623: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 
1726882611.90681: variable 'omit' from source: magic vars 29946 1726882611.90695: starting attempt loop 29946 1726882611.90720: running the handler 29946 1726882611.90769: variable '__network_connections_result' from source: set_fact 29946 1726882611.90859: variable '__network_connections_result' from source: set_fact 29946 1726882611.90970: handler run complete 29946 1726882611.90999: attempt loop complete, returning result 29946 1726882611.91007: _execute() done 29946 1726882611.91014: dumping result to json 29946 1726882611.91022: done dumping result, returning 29946 1726882611.91044: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-95e7-9dfb-0000000000a4] 29946 1726882611.91145: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a4 29946 1726882611.91217: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a4 29946 1726882611.91220: WORKER PROCESS EXITING ok: [managed_node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 29946 1726882611.91300: no more pending results, returning what we have 29946 1726882611.91303: results queue empty 29946 1726882611.91304: checking for any_errors_fatal 29946 1726882611.91311: done checking for any_errors_fatal 29946 1726882611.91312: checking for max_fail_percentage 29946 1726882611.91314: done checking for max_fail_percentage 29946 1726882611.91315: checking to see if all hosts have failed and the running result is not ok 29946 1726882611.91315: done checking to see if all hosts have failed 29946 1726882611.91316: getting the remaining hosts for this loop 29946 1726882611.91317: done getting the remaining hosts for this loop 29946 1726882611.91321: getting the next task for host managed_node2 29946 1726882611.91328: done getting next task for host managed_node2 29946 1726882611.91331: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29946 1726882611.91333: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882611.91344: getting variables 29946 1726882611.91346: in VariableManager get_vars() 29946 1726882611.91382: Calling all_inventory to load vars for managed_node2 29946 1726882611.91385: Calling groups_inventory to load vars for managed_node2 29946 1726882611.91387: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882611.91508: Calling all_plugins_play to load vars for managed_node2 29946 1726882611.91514: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882611.91518: Calling groups_plugins_play to load vars for managed_node2 29946 1726882611.93365: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882611.95304: done with get_vars() 29946 1726882611.95335: done getting variables 29946 1726882611.95567: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:36:51 -0400 (0:00:00.074) 0:00:38.067 ****** 29946 1726882611.95808: entering _queue_task() for managed_node2/debug 29946 1726882611.96318: worker is 1 (out of 1 available) 29946 1726882611.96334: exiting _queue_task() for managed_node2/debug 29946 1726882611.96345: done queuing things up, now waiting for results queue to drain 29946 1726882611.96347: waiting for pending results... 29946 1726882611.96707: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 29946 1726882611.96854: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a5 29946 1726882611.96859: variable 'ansible_search_path' from source: unknown 29946 1726882611.96862: variable 'ansible_search_path' from source: unknown 29946 1726882611.96879: calling self._execute() 29946 1726882611.96987: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882611.96991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882611.97006: variable 'omit' from source: magic vars 29946 1726882611.97410: variable 'ansible_distribution_major_version' from source: facts 29946 1726882611.97423: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882611.97541: variable 'network_state' from source: role '' defaults 29946 1726882611.97551: Evaluated conditional (network_state != {}): False 29946 1726882611.97554: when evaluation is False, skipping this task 29946 1726882611.97557: _execute() done 29946 1726882611.97560: dumping result to json 29946 1726882611.97562: done dumping result, returning 29946 1726882611.97575: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-95e7-9dfb-0000000000a5] 29946 1726882611.97578: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a5 29946 1726882611.97677: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a5 29946 1726882611.97680: WORKER PROCESS EXITING skipping: [managed_node2] => { "false_condition": "network_state != {}" } 29946 1726882611.97757: no more pending results, returning what we 
have 29946 1726882611.97760: results queue empty 29946 1726882611.97761: checking for any_errors_fatal 29946 1726882611.97768: done checking for any_errors_fatal 29946 1726882611.97768: checking for max_fail_percentage 29946 1726882611.97770: done checking for max_fail_percentage 29946 1726882611.97771: checking to see if all hosts have failed and the running result is not ok 29946 1726882611.97772: done checking to see if all hosts have failed 29946 1726882611.97772: getting the remaining hosts for this loop 29946 1726882611.97773: done getting the remaining hosts for this loop 29946 1726882611.97776: getting the next task for host managed_node2 29946 1726882611.97782: done getting next task for host managed_node2 29946 1726882611.97785: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 29946 1726882611.97787: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882611.97802: getting variables 29946 1726882611.97803: in VariableManager get_vars() 29946 1726882611.97910: Calling all_inventory to load vars for managed_node2 29946 1726882611.97913: Calling groups_inventory to load vars for managed_node2 29946 1726882611.97915: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882611.97922: Calling all_plugins_play to load vars for managed_node2 29946 1726882611.97924: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882611.97927: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.00179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.04110: done with get_vars() 29946 1726882612.04135: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:36:52 -0400 (0:00:00.085) 0:00:38.153 ****** 29946 1726882612.04360: entering _queue_task() for managed_node2/ping 29946 1726882612.05111: worker is 1 (out of 1 available) 29946 1726882612.05121: exiting _queue_task() for managed_node2/ping 29946 1726882612.05132: done queuing things up, now waiting for results queue to drain 29946 1726882612.05133: waiting for pending results... 
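The queue entry managed_node2/ping shows the role finishing with a connectivity re-check via the ping action. A minimal equivalent task is sketched below; the when guard mirrors the ansible_distribution_major_version conditional evaluated for every role task in this log, but attaching it directly to this task is an assumption:

- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution_major_version != '6'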
29946 1726882612.05815: running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 29946 1726882612.05832: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a6 29946 1726882612.05847: variable 'ansible_search_path' from source: unknown 29946 1726882612.05851: variable 'ansible_search_path' from source: unknown 29946 1726882612.05887: calling self._execute() 29946 1726882612.06090: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882612.06102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882612.06113: variable 'omit' from source: magic vars 29946 1726882612.07001: variable 'ansible_distribution_major_version' from source: facts 29946 1726882612.07005: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882612.07008: variable 'omit' from source: magic vars 29946 1726882612.07039: variable 'omit' from source: magic vars 29946 1726882612.07110: variable 'omit' from source: magic vars 29946 1726882612.07219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882612.07256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882612.07275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882612.07296: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882612.07308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882612.07335: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882612.07339: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882612.07341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882612.07762: Set connection var ansible_pipelining to False 29946 1726882612.07765: Set connection var ansible_shell_executable to /bin/sh 29946 1726882612.07767: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882612.07770: Set connection var ansible_timeout to 10 29946 1726882612.07772: Set connection var ansible_shell_type to sh 29946 1726882612.07774: Set connection var ansible_connection to ssh 29946 1726882612.07867: variable 'ansible_shell_executable' from source: unknown 29946 1726882612.07870: variable 'ansible_connection' from source: unknown 29946 1726882612.07873: variable 'ansible_module_compression' from source: unknown 29946 1726882612.07875: variable 'ansible_shell_type' from source: unknown 29946 1726882612.07877: variable 'ansible_shell_executable' from source: unknown 29946 1726882612.07879: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882612.07881: variable 'ansible_pipelining' from source: unknown 29946 1726882612.07883: variable 'ansible_timeout' from source: unknown 29946 1726882612.07885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882612.08298: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882612.08303: variable 'omit' from source: magic vars 29946 
1726882612.08305: starting attempt loop 29946 1726882612.08307: running the handler 29946 1726882612.08309: _low_level_execute_command(): starting 29946 1726882612.08311: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882612.09720: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.10018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.10321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.11978: stdout chunk (state=3): >>>/root <<< 29946 1726882612.12109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.12115: stdout chunk (state=3): >>><<< 29946 1726882612.12124: stderr chunk (state=3): >>><<< 29946 1726882612.12180: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.12184: _low_level_execute_command(): starting 29946 1726882612.12187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800 `" && echo ansible-tmp-1726882612.1215038-31728-21373683726800="` echo /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800 `" ) && sleep 0' 29946 1726882612.13681: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.13684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882612.13772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.13776: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882612.13791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.14023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.14031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.15808: stdout chunk (state=3): >>>ansible-tmp-1726882612.1215038-31728-21373683726800=/root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800 <<< 29946 1726882612.15963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.15968: stderr chunk (state=3): >>><<< 29946 1726882612.15974: stdout chunk (state=3): >>><<< 29946 1726882612.16140: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882612.1215038-31728-21373683726800=/root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.16197: variable 'ansible_module_compression' from source: unknown 29946 1726882612.16496: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 29946 1726882612.16499: variable 'ansible_facts' from source: unknown 29946 1726882612.16674: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py 29946 1726882612.17028: Sending initial data 29946 1726882612.17032: Sent initial data (152 bytes) 29946 1726882612.18229: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882612.18399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.18810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.19029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.20452: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882612.20523: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882612.20598: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpaoixy50l /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py <<< 29946 1726882612.20601: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py" <<< 29946 1726882612.20678: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpaoixy50l" to remote "/root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py" <<< 29946 1726882612.22085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.22122: stderr chunk (state=3): >>><<< 29946 1726882612.22126: stdout chunk (state=3): >>><<< 29946 1726882612.22172: done transferring module to remote 29946 1726882612.22182: _low_level_execute_command(): starting 29946 1726882612.22191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/ /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py && sleep 0' 29946 1726882612.23422: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882612.23601: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.23665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.23756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.25960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.25964: stdout chunk (state=3): >>><<< 29946 1726882612.25971: stderr chunk (state=3): >>><<< 29946 1726882612.25995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.25999: _low_level_execute_command(): starting 29946 1726882612.26002: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/AnsiballZ_ping.py && sleep 0' 29946 1726882612.27112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882612.27402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.27513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.27707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.42315: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 29946 1726882612.43482: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882612.43516: stderr chunk (state=3): >>><<< 29946 1726882612.43525: stdout chunk (state=3): >>><<< 29946 1726882612.43558: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882612.43573: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882612.43582: _low_level_execute_command(): starting 29946 1726882612.43586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882612.1215038-31728-21373683726800/ > /dev/null 2>&1 && sleep 0' 29946 1726882612.44247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.44300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882612.44303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.44306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.44308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 
1726882612.44310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.44366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882612.44372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.44428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.44529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.46431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.46435: stdout chunk (state=3): >>><<< 29946 1726882612.46437: stderr chunk (state=3): >>><<< 29946 1726882612.46513: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.46520: handler run complete 29946 1726882612.46523: attempt loop complete, returning result 29946 1726882612.46526: _execute() done 29946 1726882612.46528: dumping result to json 29946 1726882612.46530: done dumping result, returning 29946 1726882612.46532: done running TaskExecutor() for managed_node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-95e7-9dfb-0000000000a6] 29946 1726882612.46600: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a6 ok: [managed_node2] => { "changed": false, "ping": "pong" } 29946 1726882612.47016: no more pending results, returning what we have 29946 1726882612.47019: results queue empty 29946 1726882612.47020: checking for any_errors_fatal 29946 1726882612.47025: done checking for any_errors_fatal 29946 1726882612.47026: checking for max_fail_percentage 29946 1726882612.47027: done checking for max_fail_percentage 29946 1726882612.47028: checking to see if all hosts have failed and the running result is not ok 29946 1726882612.47029: done checking to see if all hosts have failed 29946 1726882612.47029: getting the remaining hosts for this loop 29946 1726882612.47031: done getting the remaining hosts for this loop 29946 1726882612.47034: getting the next task for host managed_node2 29946 1726882612.47039: done getting next task for host managed_node2 29946 1726882612.47041: ^ task is: TASK: meta (role_complete) 29946 1726882612.47043: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882612.47053: getting variables 29946 1726882612.47054: in VariableManager get_vars() 29946 1726882612.47091: Calling all_inventory to load vars for managed_node2 29946 1726882612.47096: Calling groups_inventory to load vars for managed_node2 29946 1726882612.47098: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882612.47108: Calling all_plugins_play to load vars for managed_node2 29946 1726882612.47110: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882612.47113: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.47710: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a6 29946 1726882612.47713: WORKER PROCESS EXITING 29946 1726882612.48776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.49692: done with get_vars() 29946 1726882612.49711: done getting variables 29946 1726882612.49764: done queuing things up, now waiting for results queue to drain 29946 1726882612.49766: results queue empty 29946 1726882612.49766: checking for any_errors_fatal 29946 1726882612.49768: done checking for any_errors_fatal 29946 1726882612.49768: checking for max_fail_percentage 29946 1726882612.49769: done checking for max_fail_percentage 29946 1726882612.49769: checking to see if all hosts have failed and the running result is not ok 29946 1726882612.49770: done checking to see if all hosts have failed 29946 1726882612.49770: getting the remaining hosts for this loop 29946 1726882612.49771: done getting the remaining hosts for this loop 29946 1726882612.49773: getting the next task for host managed_node2 29946 1726882612.49776: done getting next task for host managed_node2 29946 1726882612.49777: ^ task is: TASK: meta (flush_handlers) 29946 1726882612.49778: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882612.49780: getting variables 29946 1726882612.49780: in VariableManager get_vars() 29946 1726882612.49791: Calling all_inventory to load vars for managed_node2 29946 1726882612.49792: Calling groups_inventory to load vars for managed_node2 29946 1726882612.49796: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882612.49801: Calling all_plugins_play to load vars for managed_node2 29946 1726882612.49803: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882612.49804: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.51350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.53441: done with get_vars() 29946 1726882612.53467: done getting variables 29946 1726882612.53692: in VariableManager get_vars() 29946 1726882612.53714: Calling all_inventory to load vars for managed_node2 29946 1726882612.53717: Calling groups_inventory to load vars for managed_node2 29946 1726882612.53719: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882612.53724: Calling all_plugins_play to load vars for managed_node2 29946 1726882612.53727: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882612.53730: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.54884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.56178: done with get_vars() 29946 1726882612.56208: done queuing things up, now waiting for results queue to drain 29946 1726882612.56210: results queue empty 29946 1726882612.56211: checking for any_errors_fatal 29946 1726882612.56212: done checking for any_errors_fatal 29946 1726882612.56213: checking for max_fail_percentage 29946 1726882612.56214: done checking for max_fail_percentage 29946 1726882612.56215: checking to see if all hosts have failed and the running result is not ok 29946 1726882612.56216: done checking to see if all hosts have failed 29946 1726882612.56216: getting the remaining hosts for this loop 29946 1726882612.56217: done getting the remaining hosts for this loop 29946 1726882612.56220: getting the next task for host managed_node2 29946 1726882612.56223: done getting next task for host managed_node2 29946 1726882612.56225: ^ task is: TASK: meta (flush_handlers) 29946 1726882612.56226: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882612.56229: getting variables 29946 1726882612.56230: in VariableManager get_vars() 29946 1726882612.56241: Calling all_inventory to load vars for managed_node2 29946 1726882612.56243: Calling groups_inventory to load vars for managed_node2 29946 1726882612.56245: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882612.56250: Calling all_plugins_play to load vars for managed_node2 29946 1726882612.56253: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882612.56255: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.58336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.59779: done with get_vars() 29946 1726882612.59801: done getting variables 29946 1726882612.59838: in VariableManager get_vars() 29946 1726882612.59847: Calling all_inventory to load vars for managed_node2 29946 1726882612.59849: Calling groups_inventory to load vars for managed_node2 29946 1726882612.59850: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882612.59853: Calling all_plugins_play to load vars for managed_node2 29946 1726882612.59855: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882612.59856: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.60524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.61645: done with get_vars() 29946 1726882612.61663: done queuing things up, now waiting for results queue to drain 29946 1726882612.61665: results queue empty 29946 1726882612.61666: checking for any_errors_fatal 29946 1726882612.61667: done checking for any_errors_fatal 29946 1726882612.61667: checking for max_fail_percentage 29946 1726882612.61668: done checking for max_fail_percentage 29946 1726882612.61672: checking to see if all hosts have failed and the running result is not ok 29946 1726882612.61672: done checking to see if all hosts have failed 29946 1726882612.61673: getting the remaining hosts for this loop 29946 1726882612.61674: done getting the remaining hosts for this loop 29946 1726882612.61677: getting the next task for host managed_node2 29946 1726882612.61682: done getting next task for host managed_node2 29946 1726882612.61683: ^ task is: None 29946 1726882612.61684: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882612.61686: done queuing things up, now waiting for results queue to drain 29946 1726882612.61687: results queue empty 29946 1726882612.61687: checking for any_errors_fatal 29946 1726882612.61688: done checking for any_errors_fatal 29946 1726882612.61688: checking for max_fail_percentage 29946 1726882612.61689: done checking for max_fail_percentage 29946 1726882612.61690: checking to see if all hosts have failed and the running result is not ok 29946 1726882612.61691: done checking to see if all hosts have failed 29946 1726882612.61692: getting the next task for host managed_node2 29946 1726882612.61696: done getting next task for host managed_node2 29946 1726882612.61697: ^ task is: None 29946 1726882612.61698: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882612.61736: in VariableManager get_vars() 29946 1726882612.61747: done with get_vars() 29946 1726882612.61750: in VariableManager get_vars() 29946 1726882612.61762: done with get_vars() 29946 1726882612.61766: variable 'omit' from source: magic vars 29946 1726882612.61794: in VariableManager get_vars() 29946 1726882612.61803: done with get_vars() 29946 1726882612.61825: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 29946 1726882612.62069: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 29946 1726882612.62099: getting the remaining hosts for this loop 29946 1726882612.62101: done getting the remaining hosts for this loop 29946 1726882612.62103: getting the next task for host managed_node2 29946 1726882612.62105: done getting next task for host managed_node2 29946 1726882612.62107: ^ task is: TASK: Gathering Facts 29946 1726882612.62109: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882612.62111: getting variables 29946 1726882612.62111: in VariableManager get_vars() 29946 1726882612.62119: Calling all_inventory to load vars for managed_node2 29946 1726882612.62122: Calling groups_inventory to load vars for managed_node2 29946 1726882612.62123: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882612.62127: Calling all_plugins_play to load vars for managed_node2 29946 1726882612.62128: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882612.62130: Calling groups_plugins_play to load vars for managed_node2 29946 1726882612.63683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882612.64808: done with get_vars() 29946 1726882612.64825: done getting variables 29946 1726882612.64875: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227 Friday 20 September 2024 21:36:52 -0400 (0:00:00.605) 0:00:38.758 ****** 29946 1726882612.64913: entering _queue_task() for managed_node2/gather_facts 29946 1726882612.65245: worker is 1 (out of 1 available) 29946 1726882612.65256: exiting _queue_task() for managed_node2/gather_facts 29946 1726882612.65267: done queuing things up, now waiting for results queue to drain 29946 1726882612.65268: waiting for pending results... 
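The "Gathering Facts" task queued at this point is the implicit fact-gathering step for the new play ("Assert device and profile are absent"); the worker transfers AnsiballZ_setup.py and the setup module returns the ansible_facts payload that follows. A minimal sketch, assuming one wanted to trigger the same collection explicitly instead of relying on the implicit task:

# Hedged sketch -- equivalent explicit fact gathering; the play in this log
# uses the implicit "Gathering Facts" task rather than a task like this.
- name: Gather facts explicitly
  ansible.builtin.setup: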
29946 1726882612.65483: running TaskExecutor() for managed_node2/TASK: Gathering Facts 29946 1726882612.65569: in run() - task 12673a56-9f93-95e7-9dfb-00000000066a 29946 1726882612.65630: variable 'ansible_search_path' from source: unknown 29946 1726882612.65641: calling self._execute() 29946 1726882612.65747: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882612.65759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882612.65769: variable 'omit' from source: magic vars 29946 1726882612.66126: variable 'ansible_distribution_major_version' from source: facts 29946 1726882612.66140: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882612.66144: variable 'omit' from source: magic vars 29946 1726882612.66161: variable 'omit' from source: magic vars 29946 1726882612.66187: variable 'omit' from source: magic vars 29946 1726882612.66221: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882612.66249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882612.66269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882612.66284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882612.66301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882612.66339: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882612.66342: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882612.66344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882612.66430: Set connection var ansible_pipelining to False 29946 1726882612.66433: Set connection var ansible_shell_executable to /bin/sh 29946 1726882612.66439: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882612.66444: Set connection var ansible_timeout to 10 29946 1726882612.66451: Set connection var ansible_shell_type to sh 29946 1726882612.66453: Set connection var ansible_connection to ssh 29946 1726882612.66474: variable 'ansible_shell_executable' from source: unknown 29946 1726882612.66477: variable 'ansible_connection' from source: unknown 29946 1726882612.66480: variable 'ansible_module_compression' from source: unknown 29946 1726882612.66483: variable 'ansible_shell_type' from source: unknown 29946 1726882612.66485: variable 'ansible_shell_executable' from source: unknown 29946 1726882612.66487: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882612.66491: variable 'ansible_pipelining' from source: unknown 29946 1726882612.66497: variable 'ansible_timeout' from source: unknown 29946 1726882612.66500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882612.66636: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882612.66645: variable 'omit' from source: magic vars 29946 1726882612.66650: starting attempt loop 29946 1726882612.66653: running the 
handler 29946 1726882612.66666: variable 'ansible_facts' from source: unknown 29946 1726882612.66684: _low_level_execute_command(): starting 29946 1726882612.66700: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882612.67419: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.67466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882612.67496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.67525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.67657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.69242: stdout chunk (state=3): >>>/root <<< 29946 1726882612.69401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.69405: stdout chunk (state=3): >>><<< 29946 1726882612.69407: stderr chunk (state=3): >>><<< 29946 1726882612.69529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.69533: _low_level_execute_command(): starting 29946 1726882612.69535: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681 `" && echo ansible-tmp-1726882612.694361-31764-34400233973681="` echo 
/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681 `" ) && sleep 0' 29946 1726882612.70099: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882612.70204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.70255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882612.70342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882612.70421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.70539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.70760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.70858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.71003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.72857: stdout chunk (state=3): >>>ansible-tmp-1726882612.694361-31764-34400233973681=/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681 <<< 29946 1726882612.73011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.73015: stdout chunk (state=3): >>><<< 29946 1726882612.73018: stderr chunk (state=3): >>><<< 29946 1726882612.73204: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882612.694361-31764-34400233973681=/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.73208: variable 'ansible_module_compression' from source: unknown 29946 
1726882612.73210: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 29946 1726882612.73245: variable 'ansible_facts' from source: unknown 29946 1726882612.73478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py 29946 1726882612.73627: Sending initial data 29946 1726882612.73637: Sent initial data (152 bytes) 29946 1726882612.74376: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.74443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.74509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.76521: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882612.76525: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882612.76584: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmp7xsgou3o /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py <<< 29946 1726882612.76588: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py" <<< 29946 1726882612.76652: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmp7xsgou3o" to remote "/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py" <<< 29946 1726882612.81883: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.81887: stderr chunk (state=3): >>><<< 29946 1726882612.81889: stdout chunk (state=3): >>><<< 29946 1726882612.81891: done transferring module to remote 29946 1726882612.81900: _low_level_execute_command(): starting 29946 1726882612.81903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/ /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py && sleep 0' 29946 1726882612.82673: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882612.82685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.82711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882612.82733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882612.82755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882612.82807: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.82867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882612.82888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.83001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882612.83077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882612.84876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882612.84885: stdout chunk (state=3): >>><<< 29946 1726882612.85119: stderr chunk (state=3): >>><<< 29946 1726882612.85123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882612.85127: _low_level_execute_command(): starting 29946 1726882612.85130: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/AnsiballZ_setup.py && sleep 0' 29946 1726882612.86191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882612.86196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.86199: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882612.86201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882612.86262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882612.86265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882612.86589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882613.50171: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": 
["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 803, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789745152, "block_size": 4096, "block_total": 65519099, "block_available": 63913512, "block_used": 1605587, "inode_total": 131070960, "inode_available": 131029049, "inode_used": 41911, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3<<< 29946 1726882613.50200: stdout chunk (state=3): 
>>>xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "53", "epoch": "1726882613", "epoch_int": "1726882613", "date": "2024-09-20", "time": "21:36:53", "iso8601_micro": "2024-09-21T01:36:53.452965Z", "iso8601": "2024-09-21T01:36:53Z", "iso8601_basic": "20240920T213653452965", "iso8601_basic_short": "20240920T213653", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5341796875, "5m": 0.51123046875, "15m": 0.2958984375}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4<<< 29946 1726882613.50254: stdout chunk (state=3): >>>": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixe<<< 29946 1726882613.50259: stdout chunk (state=3): >>>d]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", 
"tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 29946 1726882613.52120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882613.52148: stderr chunk (state=3): >>><<< 29946 1726882613.52152: stdout chunk (state=3): >>><<< 29946 1726882613.52187: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-14-69", "ansible_nodename": "ip-10-31-14-69.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec273daf4d79783f5cba36df2f56d9d0", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2959, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 572, "free": 2959}, "nocache": {"free": 3299, "used": 232}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_uuid": "ec273daf-4d79-783f-5cba-36df2f56d9d0", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, 
"masters": {}}, "ansible_uptime_seconds": 803, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261789745152, "block_size": 4096, "block_total": 65519099, "block_available": 63913512, "block_used": 1605587, "inode_total": 131070960, "inode_available": 131029049, "inode_used": 41911, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_local": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDO9PZgr9JLdptbX1z24dINsp1ZUviCn2IFYUqfMM6j/uCKMg5pVfDr5EP5Ea09xR+KKjE9W6h445mjrxTxfVC3xCHR3VpSw3Oq+2ut1Ji+loZ+gygWU601w94ai/xsdgyml1uEyWaA+y3goILZNio8q0yQtVVMKaylDdwXYQ2zefxhpEJ2IlB2HJcJzSxCYz+Sa3mdkfG2DlXy2tqo95KEZ2m7lxzM1pkAHXup+mi3WaH4b4fHxNlRo8S/ebtmXiUYGjymQ5jck8sol0xo4LeBCRe0NKWBJZmK4X6N7Vwrb9tSp9rBJYxjQA9YCszz8i2C3Q33fP+kP2NUonq0NfFciCOt026ERL+ygggM392iXVJPF3VZfX1Pi3Z6B1PbuFZy/UE0SpwxHjWy+QRHd/SVa4YK0V3bMQ3T0bvGI2UuujjRvmDoob7j8Q4QkyY73p60sv4iob7xx/5BBlSagZNKbPiUWhOPXkHgYguuEWrbvoeQUPjhtCzQXguvY0Y6U18=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOkVDo8QW6ai2hAn3+uCY59f9/ff9I0xJwsgAdLmXdfM6LXa2YZqxM/XbCey2xlDC6ejVLDU0902Xq19HWz8n48=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMO17OwTe9G3GI2fp+men+Q6jlxYO58zd3fpAMZ6aHgk", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "36", "second": "53", "epoch": "1726882613", "epoch_int": "1726882613", "date": "2024-09-20", "time": "21:36:53", "iso8601_micro": "2024-09-21T01:36:53.452965Z", "iso8601": "2024-09-21T01:36:53Z", "iso8601_basic": "20240920T213653452965", "iso8601_basic_short": "20240920T213653", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5341796875, "5m": 0.51123046875, "15m": 0.2958984375}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 35334 10.31.14.69 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 35334 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_iscsi_iqn": "", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:c1ff:fe46:633b", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "6e:57:f6:54:9a:30", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.69", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:c1:46:63:3b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.69", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::8ff:c1ff:fe46:633b"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.69", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::8ff:c1ff:fe46:633b"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882613.52473: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882613.52496: _low_level_execute_command(): starting 29946 1726882613.52500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882612.694361-31764-34400233973681/ > /dev/null 2>&1 && sleep 0' 29946 1726882613.52948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882613.52954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882613.52956: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882613.52959: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882613.52961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882613.53012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882613.53016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882613.53079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882613.54848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882613.54872: stderr chunk (state=3): >>><<< 29946 1726882613.54875: stdout chunk (state=3): >>><<< 29946 1726882613.54886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882613.54897: handler run complete 29946 1726882613.54982: variable 'ansible_facts' from source: unknown 29946 1726882613.55052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.55251: variable 'ansible_facts' from source: unknown 29946 1726882613.55311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.55396: attempt loop complete, returning result 29946 1726882613.55399: _execute() done 29946 1726882613.55402: dumping result to json 29946 1726882613.55423: done dumping result, returning 29946 1726882613.55429: done running TaskExecutor() for managed_node2/TASK: Gathering Facts [12673a56-9f93-95e7-9dfb-00000000066a] 29946 1726882613.55434: sending task result for task 12673a56-9f93-95e7-9dfb-00000000066a 29946 1726882613.55772: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000066a 29946 1726882613.55775: WORKER PROCESS EXITING ok: [managed_node2] 29946 1726882613.56070: no more pending results, returning what we have 29946 1726882613.56073: results queue empty 29946 1726882613.56074: checking for any_errors_fatal 29946 1726882613.56075: done checking for any_errors_fatal 29946 1726882613.56076: checking for max_fail_percentage 29946 1726882613.56077: done checking for max_fail_percentage 29946 1726882613.56078: checking to see if all hosts have failed and the running result is not ok 29946 1726882613.56079: done checking to see if all hosts have failed 29946 1726882613.56080: getting the remaining hosts for this loop 29946 1726882613.56081: done getting the remaining hosts for this loop 29946 1726882613.56084: getting the next task for host managed_node2 29946 1726882613.56090: done getting next task for host managed_node2 29946 1726882613.56092: ^ task is: TASK: meta (flush_handlers) 29946 1726882613.56097: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882613.56101: getting variables 29946 1726882613.56102: in VariableManager get_vars() 29946 1726882613.56131: Calling all_inventory to load vars for managed_node2 29946 1726882613.56135: Calling groups_inventory to load vars for managed_node2 29946 1726882613.56139: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.56152: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.56156: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.56160: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.57573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.58704: done with get_vars() 29946 1726882613.58720: done getting variables 29946 1726882613.58784: in VariableManager get_vars() 29946 1726882613.58796: Calling all_inventory to load vars for managed_node2 29946 1726882613.58797: Calling groups_inventory to load vars for managed_node2 29946 1726882613.58799: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.58802: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.58804: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.58805: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.59948: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.62191: done with get_vars() 29946 1726882613.62357: done queuing things up, now waiting for results queue to drain 29946 1726882613.62359: results queue empty 29946 1726882613.62360: checking for any_errors_fatal 29946 1726882613.62363: done checking for any_errors_fatal 29946 1726882613.62364: checking for max_fail_percentage 29946 1726882613.62365: done checking for max_fail_percentage 29946 1726882613.62369: checking to see if all hosts have failed and the running result is not ok 29946 1726882613.62370: done checking to see if all hosts have failed 29946 1726882613.62371: getting the remaining hosts for this loop 29946 1726882613.62372: done getting the remaining hosts for this loop 29946 1726882613.62375: getting the next task for host managed_node2 29946 1726882613.62379: done getting next task for host managed_node2 29946 1726882613.62381: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 29946 1726882613.62383: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882613.62385: getting variables 29946 1726882613.62386: in VariableManager get_vars() 29946 1726882613.62397: Calling all_inventory to load vars for managed_node2 29946 1726882613.62399: Calling groups_inventory to load vars for managed_node2 29946 1726882613.62401: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.62406: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.62408: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.62411: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.64307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.65775: done with get_vars() 29946 1726882613.65800: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:230 Friday 20 September 2024 21:36:53 -0400 (0:00:01.009) 0:00:39.768 ****** 29946 1726882613.65883: entering _queue_task() for managed_node2/include_tasks 29946 1726882613.66321: worker is 1 (out of 1 available) 29946 1726882613.66428: exiting _queue_task() for managed_node2/include_tasks 29946 1726882613.66438: done queuing things up, now waiting for results queue to drain 29946 1726882613.66440: waiting for pending results... 29946 1726882613.66992: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' 29946 1726882613.66999: in run() - task 12673a56-9f93-95e7-9dfb-0000000000a9 29946 1726882613.67197: variable 'ansible_search_path' from source: unknown 29946 1726882613.67206: calling self._execute() 29946 1726882613.67210: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882613.67311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882613.67316: variable 'omit' from source: magic vars 29946 1726882613.68075: variable 'ansible_distribution_major_version' from source: facts 29946 1726882613.68079: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882613.68082: _execute() done 29946 1726882613.68084: dumping result to json 29946 1726882613.68086: done dumping result, returning 29946 1726882613.68089: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_profile_absent.yml' [12673a56-9f93-95e7-9dfb-0000000000a9] 29946 1726882613.68091: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a9 29946 1726882613.68471: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000a9 29946 1726882613.68474: WORKER PROCESS EXITING 29946 1726882613.68528: no more pending results, returning what we have 29946 1726882613.68534: in VariableManager get_vars() 29946 1726882613.68571: Calling all_inventory to load vars for managed_node2 29946 1726882613.68573: Calling groups_inventory to load vars for managed_node2 29946 1726882613.68576: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.68590: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.68595: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.68598: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.71725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.75995: done with get_vars() 29946 
1726882613.76015: variable 'ansible_search_path' from source: unknown 29946 1726882613.76032: we have included files to process 29946 1726882613.76034: generating all_blocks data 29946 1726882613.76035: done generating all_blocks data 29946 1726882613.76036: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 29946 1726882613.76037: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 29946 1726882613.76040: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 29946 1726882613.76545: in VariableManager get_vars() 29946 1726882613.76564: done with get_vars() 29946 1726882613.76786: done processing included file 29946 1726882613.76788: iterating over new_blocks loaded from include file 29946 1726882613.76790: in VariableManager get_vars() 29946 1726882613.76941: done with get_vars() 29946 1726882613.76943: filtering new block on tags 29946 1726882613.76961: done filtering new block on tags 29946 1726882613.76964: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed_node2 29946 1726882613.76969: extending task lists for all hosts with included blocks 29946 1726882613.77064: done extending task lists 29946 1726882613.77065: done processing included files 29946 1726882613.77065: results queue empty 29946 1726882613.77066: checking for any_errors_fatal 29946 1726882613.77068: done checking for any_errors_fatal 29946 1726882613.77069: checking for max_fail_percentage 29946 1726882613.77070: done checking for max_fail_percentage 29946 1726882613.77071: checking to see if all hosts have failed and the running result is not ok 29946 1726882613.77071: done checking to see if all hosts have failed 29946 1726882613.77072: getting the remaining hosts for this loop 29946 1726882613.77074: done getting the remaining hosts for this loop 29946 1726882613.77076: getting the next task for host managed_node2 29946 1726882613.77080: done getting next task for host managed_node2 29946 1726882613.77083: ^ task is: TASK: Include the task 'get_profile_stat.yml' 29946 1726882613.77085: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882613.77087: getting variables 29946 1726882613.77088: in VariableManager get_vars() 29946 1726882613.77099: Calling all_inventory to load vars for managed_node2 29946 1726882613.77102: Calling groups_inventory to load vars for managed_node2 29946 1726882613.77104: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.77109: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.77112: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.77115: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.80149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.83474: done with get_vars() 29946 1726882613.83618: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.178) 0:00:39.946 ****** 29946 1726882613.83695: entering _queue_task() for managed_node2/include_tasks 29946 1726882613.84523: worker is 1 (out of 1 available) 29946 1726882613.84533: exiting _queue_task() for managed_node2/include_tasks 29946 1726882613.84542: done queuing things up, now waiting for results queue to drain 29946 1726882613.84543: waiting for pending results... 29946 1726882613.84718: running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' 29946 1726882613.84855: in run() - task 12673a56-9f93-95e7-9dfb-00000000067b 29946 1726882613.84877: variable 'ansible_search_path' from source: unknown 29946 1726882613.84921: variable 'ansible_search_path' from source: unknown 29946 1726882613.84938: calling self._execute() 29946 1726882613.85048: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882613.85065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882613.85077: variable 'omit' from source: magic vars 29946 1726882613.85465: variable 'ansible_distribution_major_version' from source: facts 29946 1726882613.85499: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882613.85502: _execute() done 29946 1726882613.85505: dumping result to json 29946 1726882613.85575: done dumping result, returning 29946 1726882613.85578: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-95e7-9dfb-00000000067b] 29946 1726882613.85581: sending task result for task 12673a56-9f93-95e7-9dfb-00000000067b 29946 1726882613.85652: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000067b 29946 1726882613.85655: WORKER PROCESS EXITING 29946 1726882613.85706: no more pending results, returning what we have 29946 1726882613.85716: in VariableManager get_vars() 29946 1726882613.85750: Calling all_inventory to load vars for managed_node2 29946 1726882613.85753: Calling groups_inventory to load vars for managed_node2 29946 1726882613.85756: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.85770: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.85773: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.85776: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.89013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 29946 1726882613.92181: done with get_vars() 29946 1726882613.92203: variable 'ansible_search_path' from source: unknown 29946 1726882613.92205: variable 'ansible_search_path' from source: unknown 29946 1726882613.92245: we have included files to process 29946 1726882613.92246: generating all_blocks data 29946 1726882613.92248: done generating all_blocks data 29946 1726882613.92249: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 29946 1726882613.92250: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 29946 1726882613.92253: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 29946 1726882613.93935: done processing included file 29946 1726882613.93937: iterating over new_blocks loaded from include file 29946 1726882613.93938: in VariableManager get_vars() 29946 1726882613.93951: done with get_vars() 29946 1726882613.93953: filtering new block on tags 29946 1726882613.94089: done filtering new block on tags 29946 1726882613.94094: in VariableManager get_vars() 29946 1726882613.94107: done with get_vars() 29946 1726882613.94109: filtering new block on tags 29946 1726882613.94128: done filtering new block on tags 29946 1726882613.94130: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node2 29946 1726882613.94135: extending task lists for all hosts with included blocks 29946 1726882613.94458: done extending task lists 29946 1726882613.94459: done processing included files 29946 1726882613.94460: results queue empty 29946 1726882613.94461: checking for any_errors_fatal 29946 1726882613.94464: done checking for any_errors_fatal 29946 1726882613.94465: checking for max_fail_percentage 29946 1726882613.94466: done checking for max_fail_percentage 29946 1726882613.94467: checking to see if all hosts have failed and the running result is not ok 29946 1726882613.94468: done checking to see if all hosts have failed 29946 1726882613.94469: getting the remaining hosts for this loop 29946 1726882613.94470: done getting the remaining hosts for this loop 29946 1726882613.94473: getting the next task for host managed_node2 29946 1726882613.94477: done getting next task for host managed_node2 29946 1726882613.94479: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 29946 1726882613.94482: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882613.94484: getting variables 29946 1726882613.94485: in VariableManager get_vars() 29946 1726882613.94550: Calling all_inventory to load vars for managed_node2 29946 1726882613.94553: Calling groups_inventory to load vars for managed_node2 29946 1726882613.94555: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882613.94561: Calling all_plugins_play to load vars for managed_node2 29946 1726882613.94563: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882613.94566: Calling groups_plugins_play to load vars for managed_node2 29946 1726882613.96186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882613.99130: done with get_vars() 29946 1726882613.99155: done getting variables 29946 1726882613.99318: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:36:53 -0400 (0:00:00.156) 0:00:40.103 ****** 29946 1726882613.99350: entering _queue_task() for managed_node2/set_fact 29946 1726882613.99867: worker is 1 (out of 1 available) 29946 1726882613.99879: exiting _queue_task() for managed_node2/set_fact 29946 1726882613.99890: done queuing things up, now waiting for results queue to drain 29946 1726882613.99891: waiting for pending results... 
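For reference, the set_fact task dispatched here (get_profile_stat.yml:3) initializes the three profile flags that later tasks may overwrite. Judging from the ansible_facts reported in its result below, the task presumably looks roughly like this sketch (the exact YAML in the collection may differ):

    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
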
29946 1726882614.00614: running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag 29946 1726882614.00619: in run() - task 12673a56-9f93-95e7-9dfb-00000000068a 29946 1726882614.00621: variable 'ansible_search_path' from source: unknown 29946 1726882614.00623: variable 'ansible_search_path' from source: unknown 29946 1726882614.00821: calling self._execute() 29946 1726882614.00924: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.01044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.01059: variable 'omit' from source: magic vars 29946 1726882614.01483: variable 'ansible_distribution_major_version' from source: facts 29946 1726882614.01504: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882614.01516: variable 'omit' from source: magic vars 29946 1726882614.01569: variable 'omit' from source: magic vars 29946 1726882614.01622: variable 'omit' from source: magic vars 29946 1726882614.01666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882614.01719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882614.01746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882614.01766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882614.01781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882614.01822: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882614.01831: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.01838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.01948: Set connection var ansible_pipelining to False 29946 1726882614.02022: Set connection var ansible_shell_executable to /bin/sh 29946 1726882614.02026: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882614.02028: Set connection var ansible_timeout to 10 29946 1726882614.02031: Set connection var ansible_shell_type to sh 29946 1726882614.02033: Set connection var ansible_connection to ssh 29946 1726882614.02035: variable 'ansible_shell_executable' from source: unknown 29946 1726882614.02037: variable 'ansible_connection' from source: unknown 29946 1726882614.02040: variable 'ansible_module_compression' from source: unknown 29946 1726882614.02042: variable 'ansible_shell_type' from source: unknown 29946 1726882614.02046: variable 'ansible_shell_executable' from source: unknown 29946 1726882614.02055: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.02062: variable 'ansible_pipelining' from source: unknown 29946 1726882614.02068: variable 'ansible_timeout' from source: unknown 29946 1726882614.02076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.02220: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882614.02246: variable 
'omit' from source: magic vars 29946 1726882614.02349: starting attempt loop 29946 1726882614.02352: running the handler 29946 1726882614.02355: handler run complete 29946 1726882614.02357: attempt loop complete, returning result 29946 1726882614.02359: _execute() done 29946 1726882614.02361: dumping result to json 29946 1726882614.02363: done dumping result, returning 29946 1726882614.02365: done running TaskExecutor() for managed_node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-95e7-9dfb-00000000068a] 29946 1726882614.02366: sending task result for task 12673a56-9f93-95e7-9dfb-00000000068a 29946 1726882614.02441: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000068a 29946 1726882614.02445: WORKER PROCESS EXITING ok: [managed_node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 29946 1726882614.02511: no more pending results, returning what we have 29946 1726882614.02515: results queue empty 29946 1726882614.02516: checking for any_errors_fatal 29946 1726882614.02518: done checking for any_errors_fatal 29946 1726882614.02519: checking for max_fail_percentage 29946 1726882614.02521: done checking for max_fail_percentage 29946 1726882614.02522: checking to see if all hosts have failed and the running result is not ok 29946 1726882614.02523: done checking to see if all hosts have failed 29946 1726882614.02524: getting the remaining hosts for this loop 29946 1726882614.02525: done getting the remaining hosts for this loop 29946 1726882614.02529: getting the next task for host managed_node2 29946 1726882614.02537: done getting next task for host managed_node2 29946 1726882614.02540: ^ task is: TASK: Stat profile file 29946 1726882614.02545: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882614.02549: getting variables 29946 1726882614.02551: in VariableManager get_vars() 29946 1726882614.02586: Calling all_inventory to load vars for managed_node2 29946 1726882614.02589: Calling groups_inventory to load vars for managed_node2 29946 1726882614.02594: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882614.02607: Calling all_plugins_play to load vars for managed_node2 29946 1726882614.02610: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882614.02613: Calling groups_plugins_play to load vars for managed_node2 29946 1726882614.05207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882614.07176: done with get_vars() 29946 1726882614.07208: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:36:54 -0400 (0:00:00.079) 0:00:40.182 ****** 29946 1726882614.07313: entering _queue_task() for managed_node2/stat 29946 1726882614.07732: worker is 1 (out of 1 available) 29946 1726882614.07744: exiting _queue_task() for managed_node2/stat 29946 1726882614.07757: done queuing things up, now waiting for results queue to drain 29946 1726882614.07759: waiting for pending results... 29946 1726882614.08301: running TaskExecutor() for managed_node2/TASK: Stat profile file 29946 1726882614.08501: in run() - task 12673a56-9f93-95e7-9dfb-00000000068b 29946 1726882614.08505: variable 'ansible_search_path' from source: unknown 29946 1726882614.08507: variable 'ansible_search_path' from source: unknown 29946 1726882614.08510: calling self._execute() 29946 1726882614.08701: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.08720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.08736: variable 'omit' from source: magic vars 29946 1726882614.09611: variable 'ansible_distribution_major_version' from source: facts 29946 1726882614.09615: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882614.09618: variable 'omit' from source: magic vars 29946 1726882614.09647: variable 'omit' from source: magic vars 29946 1726882614.09898: variable 'profile' from source: include params 29946 1726882614.09910: variable 'interface' from source: set_fact 29946 1726882614.10047: variable 'interface' from source: set_fact 29946 1726882614.10050: variable 'omit' from source: magic vars 29946 1726882614.10070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882614.10112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882614.10141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882614.10174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882614.10190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882614.10229: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882614.10239: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.10246: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.10363: Set connection var ansible_pipelining to False 29946 1726882614.10386: Set connection var ansible_shell_executable to /bin/sh 29946 1726882614.10481: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882614.10489: Set connection var ansible_timeout to 10 29946 1726882614.10492: Set connection var ansible_shell_type to sh 29946 1726882614.10496: Set connection var ansible_connection to ssh 29946 1726882614.10498: variable 'ansible_shell_executable' from source: unknown 29946 1726882614.10500: variable 'ansible_connection' from source: unknown 29946 1726882614.10502: variable 'ansible_module_compression' from source: unknown 29946 1726882614.10504: variable 'ansible_shell_type' from source: unknown 29946 1726882614.10506: variable 'ansible_shell_executable' from source: unknown 29946 1726882614.10508: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.10510: variable 'ansible_pipelining' from source: unknown 29946 1726882614.10512: variable 'ansible_timeout' from source: unknown 29946 1726882614.10514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.10717: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882614.10735: variable 'omit' from source: magic vars 29946 1726882614.10808: starting attempt loop 29946 1726882614.10817: running the handler 29946 1726882614.10820: _low_level_execute_command(): starting 29946 1726882614.10823: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882614.11582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882614.11612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882614.11701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.11736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882614.11753: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.11777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.11883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.13558: stdout chunk (state=3): >>>/root <<< 29946 1726882614.13734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.13737: stdout chunk 
(state=3): >>><<< 29946 1726882614.13739: stderr chunk (state=3): >>><<< 29946 1726882614.13745: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.13747: _low_level_execute_command(): starting 29946 1726882614.13750: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371 `" && echo ansible-tmp-1726882614.1370673-31827-70213897375371="` echo /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371 `" ) && sleep 0' 29946 1726882614.15336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.15443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882614.15484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.15515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.15661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.17524: stdout chunk (state=3): >>>ansible-tmp-1726882614.1370673-31827-70213897375371=/root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371 <<< 29946 1726882614.17707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.17711: stdout chunk (state=3): >>><<< 29946 1726882614.17713: stderr chunk (state=3): >>><<< 29946 
1726882614.17906: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882614.1370673-31827-70213897375371=/root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.17910: variable 'ansible_module_compression' from source: unknown 29946 1726882614.18005: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 29946 1726882614.18055: variable 'ansible_facts' from source: unknown 29946 1726882614.18339: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py 29946 1726882614.18582: Sending initial data 29946 1726882614.18592: Sent initial data (152 bytes) 29946 1726882614.19663: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882614.19729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882614.19768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.19827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.19853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.19997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.21608: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882614.21665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882614.21733: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpgw8f0hee /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py <<< 29946 1726882614.21736: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py" <<< 29946 1726882614.21819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpgw8f0hee" to remote "/root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py" <<< 29946 1726882614.23266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.23270: stdout chunk (state=3): >>><<< 29946 1726882614.23277: stderr chunk (state=3): >>><<< 29946 1726882614.23404: done transferring module to remote 29946 1726882614.23407: _low_level_execute_command(): starting 29946 1726882614.23410: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/ /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py && sleep 0' 29946 1726882614.24575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.24812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.24902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.26643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.26761: stdout chunk 
(state=3): >>><<< 29946 1726882614.26765: stderr chunk (state=3): >>><<< 29946 1726882614.26768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.26775: _low_level_execute_command(): starting 29946 1726882614.26777: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/AnsiballZ_stat.py && sleep 0' 29946 1726882614.27871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.28018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882614.28033: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.28052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.28138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.43194: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 29946 1726882614.44229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882614.44350: stderr chunk (state=3): >>><<< 29946 1726882614.44359: stdout chunk (state=3): >>><<< 29946 1726882614.44384: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882614.44508: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882614.44524: _low_level_execute_command(): starting 29946 1726882614.44546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882614.1370673-31827-70213897375371/ > /dev/null 2>&1 && sleep 0' 29946 1726882614.45818: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882614.45822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882614.45824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.45826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882614.45828: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882614.45830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882614.45832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.45974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.46032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.46137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.47875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.47927: stderr chunk (state=3): >>><<< 29946 1726882614.48111: stdout chunk (state=3): >>><<< 29946 1726882614.48114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.48116: handler run complete 29946 1726882614.48117: attempt loop complete, returning result 29946 1726882614.48122: _execute() done 29946 1726882614.48124: dumping result to json 29946 1726882614.48125: done dumping result, returning 29946 1726882614.48127: done running TaskExecutor() for managed_node2/TASK: Stat profile file [12673a56-9f93-95e7-9dfb-00000000068b] 29946 1726882614.48128: sending task result for task 12673a56-9f93-95e7-9dfb-00000000068b 29946 1726882614.48411: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000068b 29946 1726882614.48415: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 29946 1726882614.48484: no more pending results, returning what we have 29946 1726882614.48488: results queue empty 29946 1726882614.48489: checking for any_errors_fatal 29946 1726882614.48500: done checking for any_errors_fatal 29946 1726882614.48500: checking for max_fail_percentage 29946 1726882614.48502: done checking for max_fail_percentage 29946 1726882614.48503: checking to see if all hosts have failed and the running result is not ok 29946 1726882614.48505: done checking to see if all hosts have failed 29946 
1726882614.48505: getting the remaining hosts for this loop 29946 1726882614.48507: done getting the remaining hosts for this loop 29946 1726882614.48512: getting the next task for host managed_node2 29946 1726882614.48520: done getting next task for host managed_node2 29946 1726882614.48522: ^ task is: TASK: Set NM profile exist flag based on the profile files 29946 1726882614.48530: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882614.48535: getting variables 29946 1726882614.48537: in VariableManager get_vars() 29946 1726882614.48573: Calling all_inventory to load vars for managed_node2 29946 1726882614.48576: Calling groups_inventory to load vars for managed_node2 29946 1726882614.48580: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882614.48958: Calling all_plugins_play to load vars for managed_node2 29946 1726882614.48963: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882614.48968: Calling groups_plugins_play to load vars for managed_node2 29946 1726882614.52342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882614.63635: done with get_vars() 29946 1726882614.63659: done getting variables 29946 1726882614.63914: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:36:54 -0400 (0:00:00.566) 0:00:40.748 ****** 29946 1726882614.63940: entering _queue_task() for managed_node2/set_fact 29946 1726882614.64485: worker is 1 (out of 1 available) 29946 1726882614.64579: exiting _queue_task() for managed_node2/set_fact 29946 1726882614.64622: done queuing things up, now waiting for results queue to drain 29946 1726882614.64625: waiting for pending results... 
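The stat invocation that just completed (get_profile_stat.yml:9) corresponds to a task roughly like the sketch below; the literal path and option values are taken from the resolved module_args in the result above, while the "{{ profile }}" templating and the register name are assumptions (profile_stat is the variable evaluated by the next task):

    - name: Stat profile file
      stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # resolved above to ifcfg-ethtest0
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # assumed name, matching the variable checked in the following task
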
29946 1726882614.64938: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files 29946 1726882614.65403: in run() - task 12673a56-9f93-95e7-9dfb-00000000068c 29946 1726882614.65407: variable 'ansible_search_path' from source: unknown 29946 1726882614.65410: variable 'ansible_search_path' from source: unknown 29946 1726882614.65416: calling self._execute() 29946 1726882614.65913: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.65917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.65919: variable 'omit' from source: magic vars 29946 1726882614.66760: variable 'ansible_distribution_major_version' from source: facts 29946 1726882614.66781: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882614.67076: variable 'profile_stat' from source: set_fact 29946 1726882614.67130: Evaluated conditional (profile_stat.stat.exists): False 29946 1726882614.67226: when evaluation is False, skipping this task 29946 1726882614.67229: _execute() done 29946 1726882614.67232: dumping result to json 29946 1726882614.67235: done dumping result, returning 29946 1726882614.67288: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-95e7-9dfb-00000000068c] 29946 1726882614.67297: sending task result for task 12673a56-9f93-95e7-9dfb-00000000068c skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29946 1726882614.67565: no more pending results, returning what we have 29946 1726882614.67569: results queue empty 29946 1726882614.67570: checking for any_errors_fatal 29946 1726882614.67584: done checking for any_errors_fatal 29946 1726882614.67584: checking for max_fail_percentage 29946 1726882614.67586: done checking for max_fail_percentage 29946 1726882614.67587: checking to see if all hosts have failed and the running result is not ok 29946 1726882614.67588: done checking to see if all hosts have failed 29946 1726882614.67588: getting the remaining hosts for this loop 29946 1726882614.67590: done getting the remaining hosts for this loop 29946 1726882614.67595: getting the next task for host managed_node2 29946 1726882614.67603: done getting next task for host managed_node2 29946 1726882614.67606: ^ task is: TASK: Get NM profile info 29946 1726882614.67612: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882614.67617: getting variables 29946 1726882614.67619: in VariableManager get_vars() 29946 1726882614.67653: Calling all_inventory to load vars for managed_node2 29946 1726882614.67656: Calling groups_inventory to load vars for managed_node2 29946 1726882614.67659: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882614.67674: Calling all_plugins_play to load vars for managed_node2 29946 1726882614.67677: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882614.67680: Calling groups_plugins_play to load vars for managed_node2 29946 1726882614.68230: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000068c 29946 1726882614.68234: WORKER PROCESS EXITING 29946 1726882614.69636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882614.72259: done with get_vars() 29946 1726882614.72287: done getting variables 29946 1726882614.72399: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:36:54 -0400 (0:00:00.084) 0:00:40.833 ****** 29946 1726882614.72601: entering _queue_task() for managed_node2/shell 29946 1726882614.72604: Creating lock for shell 29946 1726882614.72992: worker is 1 (out of 1 available) 29946 1726882614.73007: exiting _queue_task() for managed_node2/shell 29946 1726882614.73018: done queuing things up, now waiting for results queue to drain 29946 1726882614.73020: waiting for pending results... 
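The task skipped just above (get_profile_stat.yml:17) is a conditional set_fact gated on the stat result; given the reported false_condition, a minimal sketch of what it presumably contains is:

    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true   # assumed value; the flag name comes from the initialization task earlier
      when: profile_stat.stat.exists
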
29946 1726882614.73374: running TaskExecutor() for managed_node2/TASK: Get NM profile info 29946 1726882614.73455: in run() - task 12673a56-9f93-95e7-9dfb-00000000068d 29946 1726882614.73487: variable 'ansible_search_path' from source: unknown 29946 1726882614.73502: variable 'ansible_search_path' from source: unknown 29946 1726882614.73542: calling self._execute() 29946 1726882614.73839: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.73843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.73846: variable 'omit' from source: magic vars 29946 1726882614.74268: variable 'ansible_distribution_major_version' from source: facts 29946 1726882614.74285: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882614.74302: variable 'omit' from source: magic vars 29946 1726882614.74361: variable 'omit' from source: magic vars 29946 1726882614.74477: variable 'profile' from source: include params 29946 1726882614.74486: variable 'interface' from source: set_fact 29946 1726882614.74568: variable 'interface' from source: set_fact 29946 1726882614.74596: variable 'omit' from source: magic vars 29946 1726882614.74639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882614.74785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882614.74792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882614.74797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882614.74799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882614.74802: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882614.74804: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.74807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.74921: Set connection var ansible_pipelining to False 29946 1726882614.74933: Set connection var ansible_shell_executable to /bin/sh 29946 1726882614.74944: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882614.74956: Set connection var ansible_timeout to 10 29946 1726882614.74968: Set connection var ansible_shell_type to sh 29946 1726882614.74976: Set connection var ansible_connection to ssh 29946 1726882614.75017: variable 'ansible_shell_executable' from source: unknown 29946 1726882614.75026: variable 'ansible_connection' from source: unknown 29946 1726882614.75034: variable 'ansible_module_compression' from source: unknown 29946 1726882614.75041: variable 'ansible_shell_type' from source: unknown 29946 1726882614.75048: variable 'ansible_shell_executable' from source: unknown 29946 1726882614.75055: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882614.75062: variable 'ansible_pipelining' from source: unknown 29946 1726882614.75069: variable 'ansible_timeout' from source: unknown 29946 1726882614.75076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882614.75252: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882614.75330: variable 'omit' from source: magic vars 29946 1726882614.75333: starting attempt loop 29946 1726882614.75335: running the handler 29946 1726882614.75338: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882614.75340: _low_level_execute_command(): starting 29946 1726882614.75342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882614.76212: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.76231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.76331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.77975: stdout chunk (state=3): >>>/root <<< 29946 1726882614.78110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.78130: stdout chunk (state=3): >>><<< 29946 1726882614.78144: stderr chunk (state=3): >>><<< 29946 1726882614.78168: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.78188: _low_level_execute_command(): starting 29946 1726882614.78273: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460 `" && echo ansible-tmp-1726882614.7817528-31865-154801182256460="` echo /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460 `" ) && sleep 0' 29946 1726882614.78833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882614.78846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882614.78862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882614.78900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882614.78955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.79018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882614.79035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.79072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.79165: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.81040: stdout chunk (state=3): >>>ansible-tmp-1726882614.7817528-31865-154801182256460=/root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460 <<< 29946 1726882614.81198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.81220: stdout chunk (state=3): >>><<< 29946 1726882614.81223: stderr chunk (state=3): >>><<< 29946 1726882614.81313: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882614.7817528-31865-154801182256460=/root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.81317: variable 'ansible_module_compression' from source: unknown 29946 1726882614.81335: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882614.81379: variable 'ansible_facts' from source: unknown 29946 1726882614.81478: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py 29946 1726882614.81657: Sending initial data 29946 1726882614.81667: Sent initial data (156 bytes) 29946 1726882614.82325: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882614.82412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.82458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882614.82480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.82508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.82592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.84146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882614.84197: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 29946 1726882614.84261: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpq6m62_na /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py <<< 29946 1726882614.84265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py" <<< 29946 1726882614.84323: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpq6m62_na" to remote "/root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py" <<< 29946 1726882614.84956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.84990: stderr chunk (state=3): >>><<< 29946 1726882614.84996: stdout chunk (state=3): >>><<< 29946 1726882614.85014: done transferring module to remote 29946 1726882614.85022: _low_level_execute_command(): starting 29946 1726882614.85027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/ /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py && sleep 0' 29946 1726882614.85445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882614.85448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882614.85451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882614.85453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882614.85459: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.85510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882614.85516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.85576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882614.87631: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882614.87635: stdout chunk (state=3): >>><<< 29946 1726882614.87638: stderr chunk (state=3): >>><<< 29946 1726882614.87641: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 
originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882614.87643: _low_level_execute_command(): starting 29946 1726882614.87646: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/AnsiballZ_command.py && sleep 0' 29946 1726882614.87996: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882614.87999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.88002: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882614.88004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882614.88049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882614.88053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882614.88129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.05712: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:36:55.038774", "end": "2024-09-20 21:36:55.056081", "delta": "0:00:00.017307", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882615.07505: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882615.07509: stdout chunk (state=3): >>><<< 29946 1726882615.07511: stderr chunk (state=3): >>><<< 29946 1726882615.07514: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-20 21:36:55.038774", "end": "2024-09-20 21:36:55.056081", "delta": "0:00:00.017307", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.69 closed. 
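The JSON payload in the stdout chunk above is the return document of the traced command module run. grep exits with status 1 when it matches nothing, so rc=1 here simply means nmcli reports no connection named ethtest0 whose backing file lives under /etc. The playbook branches on this result later through the registered variable nm_profile_exists; the debug task below is a hypothetical illustration of that interpretation only, not a task from the traced playbook.

# Hypothetical illustration only - not part of the traced test playbook.
- name: Report whether the ethtest0 profile has a file under /etc
  ansible.builtin.debug:
    msg: "ethtest0 on-disk profile: {{ 'present' if nm_profile_exists.rc == 0 else 'absent' }}"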
29946 1726882615.07517: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882615.07519: _low_level_execute_command(): starting 29946 1726882615.07521: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882614.7817528-31865-154801182256460/ > /dev/null 2>&1 && sleep 0' 29946 1726882615.08076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882615.08084: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882615.08096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882615.08117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882615.08130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882615.08201: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882615.08205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.08208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882615.08211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address <<< 29946 1726882615.08214: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 29946 1726882615.08216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882615.08219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882615.08221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882615.08224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882615.08226: stderr chunk (state=3): >>>debug2: match found <<< 29946 1726882615.08228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.08303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.08307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882615.08331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.08427: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.10340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882615.10367: stderr chunk (state=3): >>><<< 29946 1726882615.10374: stdout chunk (state=3): >>><<< 
29946 1726882615.10396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882615.10409: handler run complete 29946 1726882615.10436: Evaluated conditional (False): False 29946 1726882615.10448: attempt loop complete, returning result 29946 1726882615.10453: _execute() done 29946 1726882615.10458: dumping result to json 29946 1726882615.10465: done dumping result, returning 29946 1726882615.10475: done running TaskExecutor() for managed_node2/TASK: Get NM profile info [12673a56-9f93-95e7-9dfb-00000000068d] 29946 1726882615.10482: sending task result for task 12673a56-9f93-95e7-9dfb-00000000068d fatal: [managed_node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.017307", "end": "2024-09-20 21:36:55.056081", "rc": 1, "start": "2024-09-20 21:36:55.038774" } MSG: non-zero return code ...ignoring 29946 1726882615.10691: no more pending results, returning what we have 29946 1726882615.10696: results queue empty 29946 1726882615.10697: checking for any_errors_fatal 29946 1726882615.10706: done checking for any_errors_fatal 29946 1726882615.10707: checking for max_fail_percentage 29946 1726882615.10708: done checking for max_fail_percentage 29946 1726882615.10709: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.10710: done checking to see if all hosts have failed 29946 1726882615.10711: getting the remaining hosts for this loop 29946 1726882615.10712: done getting the remaining hosts for this loop 29946 1726882615.10715: getting the next task for host managed_node2 29946 1726882615.10723: done getting next task for host managed_node2 29946 1726882615.10725: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 29946 1726882615.10728: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882615.10732: getting variables 29946 1726882615.10733: in VariableManager get_vars() 29946 1726882615.10762: Calling all_inventory to load vars for managed_node2 29946 1726882615.10764: Calling groups_inventory to load vars for managed_node2 29946 1726882615.10767: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.10778: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.10780: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.10783: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.11309: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000068d 29946 1726882615.11312: WORKER PROCESS EXITING 29946 1726882615.12044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.13230: done with get_vars() 29946 1726882615.13254: done getting variables 29946 1726882615.13313: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:36:55 -0400 (0:00:00.409) 0:00:41.243 ****** 29946 1726882615.13346: entering _queue_task() for managed_node2/set_fact 29946 1726882615.13587: worker is 1 (out of 1 available) 29946 1726882615.13598: exiting _queue_task() for managed_node2/set_fact 29946 1726882615.13610: done queuing things up, now waiting for results queue to drain 29946 1726882615.13612: waiting for pending results... 
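The fatal-but-ignored result shown above belongs to the 'Get NM profile info' task from get_profile_stat.yml. Based only on what the trace exposes (the shell pipeline with _uses_shell: true, the '...ignoring' marker, and the later check on nm_profile_exists.rc), the task presumably looks roughly like the sketch below, with {{ profile }} rendering to ethtest0 in this run; the exact YAML in the collection may differ.

# Approximate reconstruction from the trace; the real task lives in
# tests/network/playbooks/tasks/get_profile_stat.yml of the collection.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists
  ignore_errors: true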
29946 1726882615.13785: running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 29946 1726882615.13878: in run() - task 12673a56-9f93-95e7-9dfb-00000000068e 29946 1726882615.13891: variable 'ansible_search_path' from source: unknown 29946 1726882615.13896: variable 'ansible_search_path' from source: unknown 29946 1726882615.13925: calling self._execute() 29946 1726882615.14009: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.14015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.14030: variable 'omit' from source: magic vars 29946 1726882615.14316: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.14330: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.14424: variable 'nm_profile_exists' from source: set_fact 29946 1726882615.14435: Evaluated conditional (nm_profile_exists.rc == 0): False 29946 1726882615.14438: when evaluation is False, skipping this task 29946 1726882615.14441: _execute() done 29946 1726882615.14443: dumping result to json 29946 1726882615.14446: done dumping result, returning 29946 1726882615.14452: done running TaskExecutor() for managed_node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-95e7-9dfb-00000000068e] 29946 1726882615.14457: sending task result for task 12673a56-9f93-95e7-9dfb-00000000068e 29946 1726882615.14543: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000068e 29946 1726882615.14545: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 29946 1726882615.14588: no more pending results, returning what we have 29946 1726882615.14591: results queue empty 29946 1726882615.14592: checking for any_errors_fatal 29946 1726882615.14603: done checking for any_errors_fatal 29946 1726882615.14604: checking for max_fail_percentage 29946 1726882615.14606: done checking for max_fail_percentage 29946 1726882615.14607: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.14608: done checking to see if all hosts have failed 29946 1726882615.14608: getting the remaining hosts for this loop 29946 1726882615.14610: done getting the remaining hosts for this loop 29946 1726882615.14613: getting the next task for host managed_node2 29946 1726882615.14621: done getting next task for host managed_node2 29946 1726882615.14623: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 29946 1726882615.14627: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 29946 1726882615.14630: getting variables 29946 1726882615.14632: in VariableManager get_vars() 29946 1726882615.14662: Calling all_inventory to load vars for managed_node2 29946 1726882615.14664: Calling groups_inventory to load vars for managed_node2 29946 1726882615.14667: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.14677: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.14679: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.14681: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.15585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.16605: done with get_vars() 29946 1726882615.16628: done getting variables 29946 1726882615.16681: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882615.16796: variable 'profile' from source: include params 29946 1726882615.16800: variable 'interface' from source: set_fact 29946 1726882615.16863: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:36:55 -0400 (0:00:00.035) 0:00:41.278 ****** 29946 1726882615.16897: entering _queue_task() for managed_node2/command 29946 1726882615.17096: worker is 1 (out of 1 available) 29946 1726882615.17107: exiting _queue_task() for managed_node2/command 29946 1726882615.17118: done queuing things up, now waiting for results queue to drain 29946 1726882615.17119: waiting for pending results... 
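The skip recorded above is the 'Set NM profile exist flag and ansible_managed flag true based on the nmcli output' task at get_profile_stat.yml:35; its guard when: nm_profile_exists.rc == 0 evaluated to False because the nmcli pipeline returned rc=1. The fact name lsr_net_profile_exists is inferred from the assert later in this trace, so treat the keys in this sketch as assumptions; per its name, the real task presumably also sets an ansible_managed flag.

# Sketch inferred from the trace; fact names beyond lsr_net_profile_exists are assumptions.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
  when: nm_profile_exists.rc == 0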
29946 1726882615.17278: running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 29946 1726882615.17368: in run() - task 12673a56-9f93-95e7-9dfb-000000000690 29946 1726882615.17383: variable 'ansible_search_path' from source: unknown 29946 1726882615.17386: variable 'ansible_search_path' from source: unknown 29946 1726882615.17419: calling self._execute() 29946 1726882615.17497: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.17501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.17511: variable 'omit' from source: magic vars 29946 1726882615.17903: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.17907: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.17913: variable 'profile_stat' from source: set_fact 29946 1726882615.17926: Evaluated conditional (profile_stat.stat.exists): False 29946 1726882615.17929: when evaluation is False, skipping this task 29946 1726882615.17931: _execute() done 29946 1726882615.17933: dumping result to json 29946 1726882615.17936: done dumping result, returning 29946 1726882615.17943: done running TaskExecutor() for managed_node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12673a56-9f93-95e7-9dfb-000000000690] 29946 1726882615.17947: sending task result for task 12673a56-9f93-95e7-9dfb-000000000690 29946 1726882615.18039: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000690 29946 1726882615.18042: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29946 1726882615.18086: no more pending results, returning what we have 29946 1726882615.18092: results queue empty 29946 1726882615.18094: checking for any_errors_fatal 29946 1726882615.18103: done checking for any_errors_fatal 29946 1726882615.18103: checking for max_fail_percentage 29946 1726882615.18104: done checking for max_fail_percentage 29946 1726882615.18105: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.18106: done checking to see if all hosts have failed 29946 1726882615.18107: getting the remaining hosts for this loop 29946 1726882615.18108: done getting the remaining hosts for this loop 29946 1726882615.18115: getting the next task for host managed_node2 29946 1726882615.18121: done getting next task for host managed_node2 29946 1726882615.18123: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 29946 1726882615.18126: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.18129: getting variables 29946 1726882615.18130: in VariableManager get_vars() 29946 1726882615.18153: Calling all_inventory to load vars for managed_node2 29946 1726882615.18155: Calling groups_inventory to load vars for managed_node2 29946 1726882615.18158: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.18168: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.18170: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.18173: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.19521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.20579: done with get_vars() 29946 1726882615.20598: done getting variables 29946 1726882615.20637: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882615.20711: variable 'profile' from source: include params 29946 1726882615.20714: variable 'interface' from source: set_fact 29946 1726882615.20751: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:36:55 -0400 (0:00:00.038) 0:00:41.317 ****** 29946 1726882615.20771: entering _queue_task() for managed_node2/set_fact 29946 1726882615.20981: worker is 1 (out of 1 available) 29946 1726882615.20997: exiting _queue_task() for managed_node2/set_fact 29946 1726882615.21011: done queuing things up, now waiting for results queue to drain 29946 1726882615.21012: waiting for pending results... 
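This skipped task and the three that follow it (verify the ansible_managed comment, then get and verify the fingerprint comment) share the same guard, profile_stat.stat.exists, presumably registered by an earlier stat of the profile's ifcfg file in get_profile_stat.yml. Since no ifcfg-ethtest0 file exists, all four are skipped. The pattern is sketched below; the ifcfg path and the grep pattern are assumptions, only the variable names and the when: guard come from the trace.

# Pattern sketch only; the path and grep pattern are assumed, the guard is from the trace.
- name: Stat the ifcfg file for the profile
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep ansible_managed /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists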
29946 1726882615.21167: running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 29946 1726882615.21249: in run() - task 12673a56-9f93-95e7-9dfb-000000000691 29946 1726882615.21261: variable 'ansible_search_path' from source: unknown 29946 1726882615.21265: variable 'ansible_search_path' from source: unknown 29946 1726882615.21295: calling self._execute() 29946 1726882615.21377: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.21382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.21394: variable 'omit' from source: magic vars 29946 1726882615.21653: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.21663: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.21745: variable 'profile_stat' from source: set_fact 29946 1726882615.21755: Evaluated conditional (profile_stat.stat.exists): False 29946 1726882615.21758: when evaluation is False, skipping this task 29946 1726882615.21761: _execute() done 29946 1726882615.21764: dumping result to json 29946 1726882615.21767: done dumping result, returning 29946 1726882615.21770: done running TaskExecutor() for managed_node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12673a56-9f93-95e7-9dfb-000000000691] 29946 1726882615.21778: sending task result for task 12673a56-9f93-95e7-9dfb-000000000691 29946 1726882615.21858: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000691 29946 1726882615.21861: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29946 1726882615.21928: no more pending results, returning what we have 29946 1726882615.21931: results queue empty 29946 1726882615.21932: checking for any_errors_fatal 29946 1726882615.21937: done checking for any_errors_fatal 29946 1726882615.21937: checking for max_fail_percentage 29946 1726882615.21939: done checking for max_fail_percentage 29946 1726882615.21940: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.21940: done checking to see if all hosts have failed 29946 1726882615.21941: getting the remaining hosts for this loop 29946 1726882615.21942: done getting the remaining hosts for this loop 29946 1726882615.21945: getting the next task for host managed_node2 29946 1726882615.21950: done getting next task for host managed_node2 29946 1726882615.21952: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 29946 1726882615.21956: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.21959: getting variables 29946 1726882615.21960: in VariableManager get_vars() 29946 1726882615.21987: Calling all_inventory to load vars for managed_node2 29946 1726882615.21991: Calling groups_inventory to load vars for managed_node2 29946 1726882615.21995: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.22003: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.22005: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.22008: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.22762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.24028: done with get_vars() 29946 1726882615.24047: done getting variables 29946 1726882615.24102: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882615.24196: variable 'profile' from source: include params 29946 1726882615.24199: variable 'interface' from source: set_fact 29946 1726882615.24249: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-ethtest0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:36:55 -0400 (0:00:00.035) 0:00:41.352 ****** 29946 1726882615.24277: entering _queue_task() for managed_node2/command 29946 1726882615.24540: worker is 1 (out of 1 available) 29946 1726882615.24550: exiting _queue_task() for managed_node2/command 29946 1726882615.24561: done queuing things up, now waiting for results queue to drain 29946 1726882615.24562: waiting for pending results... 
29946 1726882615.25009: running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 29946 1726882615.25013: in run() - task 12673a56-9f93-95e7-9dfb-000000000692 29946 1726882615.25016: variable 'ansible_search_path' from source: unknown 29946 1726882615.25018: variable 'ansible_search_path' from source: unknown 29946 1726882615.25020: calling self._execute() 29946 1726882615.25081: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.25096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.25108: variable 'omit' from source: magic vars 29946 1726882615.25398: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.25408: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.25485: variable 'profile_stat' from source: set_fact 29946 1726882615.25500: Evaluated conditional (profile_stat.stat.exists): False 29946 1726882615.25503: when evaluation is False, skipping this task 29946 1726882615.25506: _execute() done 29946 1726882615.25509: dumping result to json 29946 1726882615.25511: done dumping result, returning 29946 1726882615.25515: done running TaskExecutor() for managed_node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12673a56-9f93-95e7-9dfb-000000000692] 29946 1726882615.25520: sending task result for task 12673a56-9f93-95e7-9dfb-000000000692 29946 1726882615.25600: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000692 29946 1726882615.25603: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29946 1726882615.25649: no more pending results, returning what we have 29946 1726882615.25652: results queue empty 29946 1726882615.25653: checking for any_errors_fatal 29946 1726882615.25663: done checking for any_errors_fatal 29946 1726882615.25663: checking for max_fail_percentage 29946 1726882615.25665: done checking for max_fail_percentage 29946 1726882615.25666: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.25667: done checking to see if all hosts have failed 29946 1726882615.25668: getting the remaining hosts for this loop 29946 1726882615.25669: done getting the remaining hosts for this loop 29946 1726882615.25672: getting the next task for host managed_node2 29946 1726882615.25678: done getting next task for host managed_node2 29946 1726882615.25680: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 29946 1726882615.25683: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.25687: getting variables 29946 1726882615.25688: in VariableManager get_vars() 29946 1726882615.25717: Calling all_inventory to load vars for managed_node2 29946 1726882615.25719: Calling groups_inventory to load vars for managed_node2 29946 1726882615.25722: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.25732: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.25734: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.25737: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.26489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.27362: done with get_vars() 29946 1726882615.27377: done getting variables 29946 1726882615.27418: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882615.27486: variable 'profile' from source: include params 29946 1726882615.27489: variable 'interface' from source: set_fact 29946 1726882615.27527: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:36:55 -0400 (0:00:00.032) 0:00:41.385 ****** 29946 1726882615.27549: entering _queue_task() for managed_node2/set_fact 29946 1726882615.27740: worker is 1 (out of 1 available) 29946 1726882615.27752: exiting _queue_task() for managed_node2/set_fact 29946 1726882615.27763: done queuing things up, now waiting for results queue to drain 29946 1726882615.27764: waiting for pending results... 
29946 1726882615.27917: running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 29946 1726882615.28000: in run() - task 12673a56-9f93-95e7-9dfb-000000000693 29946 1726882615.28011: variable 'ansible_search_path' from source: unknown 29946 1726882615.28015: variable 'ansible_search_path' from source: unknown 29946 1726882615.28041: calling self._execute() 29946 1726882615.28115: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.28118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.28127: variable 'omit' from source: magic vars 29946 1726882615.28372: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.28382: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.28468: variable 'profile_stat' from source: set_fact 29946 1726882615.28478: Evaluated conditional (profile_stat.stat.exists): False 29946 1726882615.28481: when evaluation is False, skipping this task 29946 1726882615.28483: _execute() done 29946 1726882615.28486: dumping result to json 29946 1726882615.28488: done dumping result, returning 29946 1726882615.28499: done running TaskExecutor() for managed_node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12673a56-9f93-95e7-9dfb-000000000693] 29946 1726882615.28504: sending task result for task 12673a56-9f93-95e7-9dfb-000000000693 29946 1726882615.28583: done sending task result for task 12673a56-9f93-95e7-9dfb-000000000693 29946 1726882615.28587: WORKER PROCESS EXITING skipping: [managed_node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 29946 1726882615.28631: no more pending results, returning what we have 29946 1726882615.28634: results queue empty 29946 1726882615.28635: checking for any_errors_fatal 29946 1726882615.28640: done checking for any_errors_fatal 29946 1726882615.28641: checking for max_fail_percentage 29946 1726882615.28642: done checking for max_fail_percentage 29946 1726882615.28643: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.28644: done checking to see if all hosts have failed 29946 1726882615.28644: getting the remaining hosts for this loop 29946 1726882615.28645: done getting the remaining hosts for this loop 29946 1726882615.28648: getting the next task for host managed_node2 29946 1726882615.28656: done getting next task for host managed_node2 29946 1726882615.28658: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 29946 1726882615.28661: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.28664: getting variables 29946 1726882615.28665: in VariableManager get_vars() 29946 1726882615.28688: Calling all_inventory to load vars for managed_node2 29946 1726882615.28690: Calling groups_inventory to load vars for managed_node2 29946 1726882615.28694: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.28703: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.28705: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.28708: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.29567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.30428: done with get_vars() 29946 1726882615.30442: done getting variables 29946 1726882615.30481: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882615.30552: variable 'profile' from source: include params 29946 1726882615.30555: variable 'interface' from source: set_fact 29946 1726882615.30594: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'ethtest0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Friday 20 September 2024 21:36:55 -0400 (0:00:00.030) 0:00:41.415 ****** 29946 1726882615.30614: entering _queue_task() for managed_node2/assert 29946 1726882615.30799: worker is 1 (out of 1 available) 29946 1726882615.30811: exiting _queue_task() for managed_node2/assert 29946 1726882615.30822: done queuing things up, now waiting for results queue to drain 29946 1726882615.30823: waiting for pending results... 
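The trace that follows runs the 'Assert that the profile is absent' check from assert_profile_absent.yml:5. The only condition visible in the log is not lsr_net_profile_exists, which holds because the set_fact that would have flipped the flag to true was skipped (the flag is presumably initialized to false earlier in get_profile_stat.yml). A minimal sketch consistent with that conditional:

# Minimal sketch consistent with the conditional shown in the trace;
# the real task in assert_profile_absent.yml may carry extra assertions or a failure message.
- name: Assert that the profile is absent - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists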
29946 1726882615.30972: running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' 29946 1726882615.31040: in run() - task 12673a56-9f93-95e7-9dfb-00000000067c 29946 1726882615.31051: variable 'ansible_search_path' from source: unknown 29946 1726882615.31055: variable 'ansible_search_path' from source: unknown 29946 1726882615.31080: calling self._execute() 29946 1726882615.31150: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.31155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.31163: variable 'omit' from source: magic vars 29946 1726882615.31413: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.31422: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.31427: variable 'omit' from source: magic vars 29946 1726882615.31453: variable 'omit' from source: magic vars 29946 1726882615.31525: variable 'profile' from source: include params 29946 1726882615.31529: variable 'interface' from source: set_fact 29946 1726882615.31571: variable 'interface' from source: set_fact 29946 1726882615.31585: variable 'omit' from source: magic vars 29946 1726882615.31619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882615.31645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882615.31659: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882615.31672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882615.31682: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882615.31709: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882615.31712: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.31715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.31783: Set connection var ansible_pipelining to False 29946 1726882615.31787: Set connection var ansible_shell_executable to /bin/sh 29946 1726882615.31807: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882615.31812: Set connection var ansible_timeout to 10 29946 1726882615.31815: Set connection var ansible_shell_type to sh 29946 1726882615.31817: Set connection var ansible_connection to ssh 29946 1726882615.31833: variable 'ansible_shell_executable' from source: unknown 29946 1726882615.31836: variable 'ansible_connection' from source: unknown 29946 1726882615.31838: variable 'ansible_module_compression' from source: unknown 29946 1726882615.31840: variable 'ansible_shell_type' from source: unknown 29946 1726882615.31842: variable 'ansible_shell_executable' from source: unknown 29946 1726882615.31844: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.31849: variable 'ansible_pipelining' from source: unknown 29946 1726882615.31851: variable 'ansible_timeout' from source: unknown 29946 1726882615.31855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.31952: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882615.31960: variable 'omit' from source: magic vars 29946 1726882615.31965: starting attempt loop 29946 1726882615.31968: running the handler 29946 1726882615.32055: variable 'lsr_net_profile_exists' from source: set_fact 29946 1726882615.32058: Evaluated conditional (not lsr_net_profile_exists): True 29946 1726882615.32064: handler run complete 29946 1726882615.32074: attempt loop complete, returning result 29946 1726882615.32076: _execute() done 29946 1726882615.32079: dumping result to json 29946 1726882615.32081: done dumping result, returning 29946 1726882615.32087: done running TaskExecutor() for managed_node2/TASK: Assert that the profile is absent - 'ethtest0' [12673a56-9f93-95e7-9dfb-00000000067c] 29946 1726882615.32095: sending task result for task 12673a56-9f93-95e7-9dfb-00000000067c 29946 1726882615.32172: done sending task result for task 12673a56-9f93-95e7-9dfb-00000000067c 29946 1726882615.32175: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882615.32220: no more pending results, returning what we have 29946 1726882615.32223: results queue empty 29946 1726882615.32223: checking for any_errors_fatal 29946 1726882615.32231: done checking for any_errors_fatal 29946 1726882615.32231: checking for max_fail_percentage 29946 1726882615.32233: done checking for max_fail_percentage 29946 1726882615.32234: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.32234: done checking to see if all hosts have failed 29946 1726882615.32235: getting the remaining hosts for this loop 29946 1726882615.32236: done getting the remaining hosts for this loop 29946 1726882615.32239: getting the next task for host managed_node2 29946 1726882615.32246: done getting next task for host managed_node2 29946 1726882615.32248: ^ task is: TASK: Include the task 'assert_device_absent.yml' 29946 1726882615.32250: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.32254: getting variables 29946 1726882615.32255: in VariableManager get_vars() 29946 1726882615.32278: Calling all_inventory to load vars for managed_node2 29946 1726882615.32280: Calling groups_inventory to load vars for managed_node2 29946 1726882615.32285: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.32296: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.32299: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.32302: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.33059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.33938: done with get_vars() 29946 1726882615.33952: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:234 Friday 20 September 2024 21:36:55 -0400 (0:00:00.033) 0:00:41.449 ****** 29946 1726882615.34013: entering _queue_task() for managed_node2/include_tasks 29946 1726882615.34196: worker is 1 (out of 1 available) 29946 1726882615.34208: exiting _queue_task() for managed_node2/include_tasks 29946 1726882615.34218: done queuing things up, now waiting for results queue to drain 29946 1726882615.34219: waiting for pending results... 29946 1726882615.34375: running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' 29946 1726882615.34451: in run() - task 12673a56-9f93-95e7-9dfb-0000000000aa 29946 1726882615.34462: variable 'ansible_search_path' from source: unknown 29946 1726882615.34486: calling self._execute() 29946 1726882615.34561: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.34567: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.34575: variable 'omit' from source: magic vars 29946 1726882615.34834: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.34843: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.34849: _execute() done 29946 1726882615.34852: dumping result to json 29946 1726882615.34854: done dumping result, returning 29946 1726882615.34862: done running TaskExecutor() for managed_node2/TASK: Include the task 'assert_device_absent.yml' [12673a56-9f93-95e7-9dfb-0000000000aa] 29946 1726882615.34864: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000aa 29946 1726882615.34946: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000aa 29946 1726882615.34948: WORKER PROCESS EXITING 29946 1726882615.35003: no more pending results, returning what we have 29946 1726882615.35007: in VariableManager get_vars() 29946 1726882615.35032: Calling all_inventory to load vars for managed_node2 29946 1726882615.35035: Calling groups_inventory to load vars for managed_node2 29946 1726882615.35037: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.35045: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.35047: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.35050: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.35901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.36745: done with get_vars() 29946 
1726882615.36757: variable 'ansible_search_path' from source: unknown 29946 1726882615.36766: we have included files to process 29946 1726882615.36767: generating all_blocks data 29946 1726882615.36768: done generating all_blocks data 29946 1726882615.36772: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 29946 1726882615.36772: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 29946 1726882615.36774: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 29946 1726882615.36874: in VariableManager get_vars() 29946 1726882615.36883: done with get_vars() 29946 1726882615.36952: done processing included file 29946 1726882615.36953: iterating over new_blocks loaded from include file 29946 1726882615.36954: in VariableManager get_vars() 29946 1726882615.36961: done with get_vars() 29946 1726882615.36962: filtering new block on tags 29946 1726882615.36972: done filtering new block on tags 29946 1726882615.36973: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed_node2 29946 1726882615.36976: extending task lists for all hosts with included blocks 29946 1726882615.37065: done extending task lists 29946 1726882615.37066: done processing included files 29946 1726882615.37067: results queue empty 29946 1726882615.37067: checking for any_errors_fatal 29946 1726882615.37069: done checking for any_errors_fatal 29946 1726882615.37070: checking for max_fail_percentage 29946 1726882615.37070: done checking for max_fail_percentage 29946 1726882615.37071: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.37071: done checking to see if all hosts have failed 29946 1726882615.37072: getting the remaining hosts for this loop 29946 1726882615.37072: done getting the remaining hosts for this loop 29946 1726882615.37074: getting the next task for host managed_node2 29946 1726882615.37076: done getting next task for host managed_node2 29946 1726882615.37078: ^ task is: TASK: Include the task 'get_interface_stat.yml' 29946 1726882615.37079: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.37080: getting variables 29946 1726882615.37081: in VariableManager get_vars() 29946 1726882615.37086: Calling all_inventory to load vars for managed_node2 29946 1726882615.37087: Calling groups_inventory to load vars for managed_node2 29946 1726882615.37088: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.37094: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.37096: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.37098: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.37739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.38675: done with get_vars() 29946 1726882615.38701: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Friday 20 September 2024 21:36:55 -0400 (0:00:00.047) 0:00:41.497 ****** 29946 1726882615.38748: entering _queue_task() for managed_node2/include_tasks 29946 1726882615.38966: worker is 1 (out of 1 available) 29946 1726882615.38977: exiting _queue_task() for managed_node2/include_tasks 29946 1726882615.38988: done queuing things up, now waiting for results queue to drain 29946 1726882615.38989: waiting for pending results... 29946 1726882615.39322: running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' 29946 1726882615.39386: in run() - task 12673a56-9f93-95e7-9dfb-0000000006c4 29946 1726882615.39409: variable 'ansible_search_path' from source: unknown 29946 1726882615.39422: variable 'ansible_search_path' from source: unknown 29946 1726882615.39469: calling self._execute() 29946 1726882615.39572: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.39635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.39639: variable 'omit' from source: magic vars 29946 1726882615.39969: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.39980: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.39986: _execute() done 29946 1726882615.39991: dumping result to json 29946 1726882615.39997: done dumping result, returning 29946 1726882615.40000: done running TaskExecutor() for managed_node2/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-95e7-9dfb-0000000006c4] 29946 1726882615.40005: sending task result for task 12673a56-9f93-95e7-9dfb-0000000006c4 29946 1726882615.40088: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000006c4 29946 1726882615.40092: WORKER PROCESS EXITING 29946 1726882615.40133: no more pending results, returning what we have 29946 1726882615.40138: in VariableManager get_vars() 29946 1726882615.40169: Calling all_inventory to load vars for managed_node2 29946 1726882615.40171: Calling groups_inventory to load vars for managed_node2 29946 1726882615.40174: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.40185: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.40188: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.40197: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.41204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 29946 1726882615.42667: done with get_vars() 29946 1726882615.42680: variable 'ansible_search_path' from source: unknown 29946 1726882615.42681: variable 'ansible_search_path' from source: unknown 29946 1726882615.42707: we have included files to process 29946 1726882615.42708: generating all_blocks data 29946 1726882615.42709: done generating all_blocks data 29946 1726882615.42710: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29946 1726882615.42711: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29946 1726882615.42713: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 29946 1726882615.42845: done processing included file 29946 1726882615.42847: iterating over new_blocks loaded from include file 29946 1726882615.42848: in VariableManager get_vars() 29946 1726882615.42859: done with get_vars() 29946 1726882615.42860: filtering new block on tags 29946 1726882615.42873: done filtering new block on tags 29946 1726882615.42875: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node2 29946 1726882615.42879: extending task lists for all hosts with included blocks 29946 1726882615.42961: done extending task lists 29946 1726882615.42962: done processing included files 29946 1726882615.42963: results queue empty 29946 1726882615.42964: checking for any_errors_fatal 29946 1726882615.42966: done checking for any_errors_fatal 29946 1726882615.42967: checking for max_fail_percentage 29946 1726882615.42968: done checking for max_fail_percentage 29946 1726882615.42968: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.42969: done checking to see if all hosts have failed 29946 1726882615.42970: getting the remaining hosts for this loop 29946 1726882615.42971: done getting the remaining hosts for this loop 29946 1726882615.42973: getting the next task for host managed_node2 29946 1726882615.42976: done getting next task for host managed_node2 29946 1726882615.42978: ^ task is: TASK: Get stat for interface {{ interface }} 29946 1726882615.42980: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.42982: getting variables 29946 1726882615.42983: in VariableManager get_vars() 29946 1726882615.42990: Calling all_inventory to load vars for managed_node2 29946 1726882615.42991: Calling groups_inventory to load vars for managed_node2 29946 1726882615.42995: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.42999: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.43001: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.43003: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.43886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.45277: done with get_vars() 29946 1726882615.45299: done getting variables 29946 1726882615.45453: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:36:55 -0400 (0:00:00.067) 0:00:41.564 ****** 29946 1726882615.45483: entering _queue_task() for managed_node2/stat 29946 1726882615.45769: worker is 1 (out of 1 available) 29946 1726882615.45781: exiting _queue_task() for managed_node2/stat 29946 1726882615.45791: done queuing things up, now waiting for results queue to drain 29946 1726882615.45795: waiting for pending results... 29946 1726882615.46110: running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 29946 1726882615.46116: in run() - task 12673a56-9f93-95e7-9dfb-0000000006de 29946 1726882615.46126: variable 'ansible_search_path' from source: unknown 29946 1726882615.46133: variable 'ansible_search_path' from source: unknown 29946 1726882615.46172: calling self._execute() 29946 1726882615.46281: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.46298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.46317: variable 'omit' from source: magic vars 29946 1726882615.46689: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.46711: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.46723: variable 'omit' from source: magic vars 29946 1726882615.46778: variable 'omit' from source: magic vars 29946 1726882615.46883: variable 'interface' from source: set_fact 29946 1726882615.46909: variable 'omit' from source: magic vars 29946 1726882615.46953: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882615.47078: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882615.47081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882615.47085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882615.47087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882615.47092: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882615.47105: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.47113: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node2' 29946 1726882615.47222: Set connection var ansible_pipelining to False 29946 1726882615.47234: Set connection var ansible_shell_executable to /bin/sh 29946 1726882615.47245: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882615.47255: Set connection var ansible_timeout to 10 29946 1726882615.47266: Set connection var ansible_shell_type to sh 29946 1726882615.47273: Set connection var ansible_connection to ssh 29946 1726882615.47308: variable 'ansible_shell_executable' from source: unknown 29946 1726882615.47317: variable 'ansible_connection' from source: unknown 29946 1726882615.47325: variable 'ansible_module_compression' from source: unknown 29946 1726882615.47332: variable 'ansible_shell_type' from source: unknown 29946 1726882615.47340: variable 'ansible_shell_executable' from source: unknown 29946 1726882615.47347: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.47400: variable 'ansible_pipelining' from source: unknown 29946 1726882615.47403: variable 'ansible_timeout' from source: unknown 29946 1726882615.47406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.47568: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 29946 1726882615.47585: variable 'omit' from source: magic vars 29946 1726882615.47599: starting attempt loop 29946 1726882615.47607: running the handler 29946 1726882615.47631: _low_level_execute_command(): starting 29946 1726882615.47644: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882615.48495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882615.48499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882615.48502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882615.48505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.48548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.48559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882615.48578: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.48681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.50591: stdout chunk (state=3): >>>/root <<< 29946 1726882615.50596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882615.50599: stdout chunk 
(state=3): >>><<< 29946 1726882615.50601: stderr chunk (state=3): >>><<< 29946 1726882615.50604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882615.50607: _low_level_execute_command(): starting 29946 1726882615.50609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790 `" && echo ansible-tmp-1726882615.505756-31931-190395265712790="` echo /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790 `" ) && sleep 0' 29946 1726882615.51724: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882615.51730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882615.51873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found <<< 29946 1726882615.51885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 29946 1726882615.51889: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882615.51904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.51907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.51910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882615.52123: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.52187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.54066: stdout chunk (state=3): 
>>>ansible-tmp-1726882615.505756-31931-190395265712790=/root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790 <<< 29946 1726882615.54236: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882615.54240: stdout chunk (state=3): >>><<< 29946 1726882615.54242: stderr chunk (state=3): >>><<< 29946 1726882615.54259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882615.505756-31931-190395265712790=/root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882615.54316: variable 'ansible_module_compression' from source: unknown 29946 1726882615.54398: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 29946 1726882615.54419: variable 'ansible_facts' from source: unknown 29946 1726882615.54523: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py 29946 1726882615.54750: Sending initial data 29946 1726882615.54753: Sent initial data (152 bytes) 29946 1726882615.55309: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.55389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.55423: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.55488: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 29946 1726882615.57040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882615.57113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882615.57184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpvapbywb1 /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py <<< 29946 1726882615.57189: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py" <<< 29946 1726882615.57255: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpvapbywb1" to remote "/root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py" <<< 29946 1726882615.58068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882615.58079: stderr chunk (state=3): >>><<< 29946 1726882615.58197: stdout chunk (state=3): >>><<< 29946 1726882615.58200: done transferring module to remote 29946 1726882615.58202: _low_level_execute_command(): starting 29946 1726882615.58204: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/ /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py && sleep 0' 29946 1726882615.58757: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882615.58775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882615.58794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882615.58813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882615.58886: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 
1726882615.58936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.58958: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882615.58973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.59075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.60845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882615.60849: stdout chunk (state=3): >>><<< 29946 1726882615.60851: stderr chunk (state=3): >>><<< 29946 1726882615.60869: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882615.60877: _low_level_execute_command(): starting 29946 1726882615.60949: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/AnsiballZ_stat.py && sleep 0' 29946 1726882615.61481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882615.61516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882615.61530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882615.61556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882615.61572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882615.61659: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.61684: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.61703: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882615.61722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.61816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.76624: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 29946 1726882615.77859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. <<< 29946 1726882615.77869: stdout chunk (state=3): >>><<< 29946 1726882615.77880: stderr chunk (state=3): >>><<< 29946 1726882615.78030: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
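The module_args echoed in the stat result above (path /sys/class/net/ethtest0 with get_attributes, get_checksum and get_mime all false) suggest that the included get_interface_stat.yml reduces to a single stat task whose result is registered for the later assert. A minimal sketch consistent with the logged invocation and the task name seen earlier ("Get stat for interface {{ interface }}"), not the verbatim file contents:

    # sketch reconstructed from the logged module_args; not copied from the file
    - name: Get stat for interface {{ interface }}
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat

With interface set to ethtest0 (as the set_fact variable sources above indicate), the module returns stat.exists: false because /sys/class/net/ethtest0 no longer exists on the managed host.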
29946 1726882615.78034: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882615.78036: _low_level_execute_command(): starting 29946 1726882615.78038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882615.505756-31931-190395265712790/ > /dev/null 2>&1 && sleep 0' 29946 1726882615.78633: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882615.78710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882615.78759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882615.78780: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882615.78809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882615.78903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882615.80752: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882615.80773: stdout chunk (state=3): >>><<< 29946 1726882615.80784: stderr chunk (state=3): >>><<< 29946 1726882615.80807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882615.80819: handler run complete 29946 1726882615.80843: attempt loop complete, returning result 29946 1726882615.80850: _execute() done 29946 1726882615.80856: dumping result to json 29946 1726882615.80862: done dumping result, returning 29946 1726882615.80981: done running TaskExecutor() for managed_node2/TASK: Get stat for interface ethtest0 [12673a56-9f93-95e7-9dfb-0000000006de] 29946 1726882615.80984: sending task result for task 12673a56-9f93-95e7-9dfb-0000000006de 29946 1726882615.81055: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000006de 29946 1726882615.81058: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false, "stat": { "exists": false } } 29946 1726882615.81123: no more pending results, returning what we have 29946 1726882615.81128: results queue empty 29946 1726882615.81129: checking for any_errors_fatal 29946 1726882615.81131: done checking for any_errors_fatal 29946 1726882615.81131: checking for max_fail_percentage 29946 1726882615.81133: done checking for max_fail_percentage 29946 1726882615.81134: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.81135: done checking to see if all hosts have failed 29946 1726882615.81135: getting the remaining hosts for this loop 29946 1726882615.81137: done getting the remaining hosts for this loop 29946 1726882615.81141: getting the next task for host managed_node2 29946 1726882615.81150: done getting next task for host managed_node2 29946 1726882615.81152: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 29946 1726882615.81155: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.81160: getting variables 29946 1726882615.81162: in VariableManager get_vars() 29946 1726882615.81197: Calling all_inventory to load vars for managed_node2 29946 1726882615.81200: Calling groups_inventory to load vars for managed_node2 29946 1726882615.81205: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.81217: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.81220: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.81224: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.82949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.84662: done with get_vars() 29946 1726882615.84684: done getting variables 29946 1726882615.84746: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 29946 1726882615.84873: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Friday 20 September 2024 21:36:55 -0400 (0:00:00.394) 0:00:41.958 ****** 29946 1726882615.84909: entering _queue_task() for managed_node2/assert 29946 1726882615.85326: worker is 1 (out of 1 available) 29946 1726882615.85336: exiting _queue_task() for managed_node2/assert 29946 1726882615.85344: done queuing things up, now waiting for results queue to drain 29946 1726882615.85345: waiting for pending results... 
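The task about to run (assert_device_absent.yml:5) evaluates that registered stat result. A plausible sketch, assuming the file simply asserts on interface_stat as the conditional logged below ("not interface_stat.stat.exists") implies:

    # assertion expression is confirmed by the log; the msg text is illustrative only
    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists
        msg: "Interface {{ interface }} is still present"

Only the task name, the assertion expression, and the fact that it passed ("All assertions passed") come from the log itself.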
29946 1726882615.85581: running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' 29946 1726882615.85642: in run() - task 12673a56-9f93-95e7-9dfb-0000000006c5 29946 1726882615.85679: variable 'ansible_search_path' from source: unknown 29946 1726882615.85682: variable 'ansible_search_path' from source: unknown 29946 1726882615.85745: calling self._execute() 29946 1726882615.85833: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.85853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.85869: variable 'omit' from source: magic vars 29946 1726882615.86292: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.86297: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.86300: variable 'omit' from source: magic vars 29946 1726882615.86346: variable 'omit' from source: magic vars 29946 1726882615.86457: variable 'interface' from source: set_fact 29946 1726882615.86498: variable 'omit' from source: magic vars 29946 1726882615.86531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882615.86577: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882615.86616: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882615.86725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882615.86729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882615.86731: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882615.86733: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.86736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.86812: Set connection var ansible_pipelining to False 29946 1726882615.86824: Set connection var ansible_shell_executable to /bin/sh 29946 1726882615.86840: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882615.86855: Set connection var ansible_timeout to 10 29946 1726882615.86867: Set connection var ansible_shell_type to sh 29946 1726882615.86874: Set connection var ansible_connection to ssh 29946 1726882615.86907: variable 'ansible_shell_executable' from source: unknown 29946 1726882615.86916: variable 'ansible_connection' from source: unknown 29946 1726882615.86922: variable 'ansible_module_compression' from source: unknown 29946 1726882615.86929: variable 'ansible_shell_type' from source: unknown 29946 1726882615.86941: variable 'ansible_shell_executable' from source: unknown 29946 1726882615.86956: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.86959: variable 'ansible_pipelining' from source: unknown 29946 1726882615.87051: variable 'ansible_timeout' from source: unknown 29946 1726882615.87054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.87127: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 29946 1726882615.87145: variable 'omit' from source: magic vars 29946 1726882615.87162: starting attempt loop 29946 1726882615.87172: running the handler 29946 1726882615.87331: variable 'interface_stat' from source: set_fact 29946 1726882615.87346: Evaluated conditional (not interface_stat.stat.exists): True 29946 1726882615.87356: handler run complete 29946 1726882615.87378: attempt loop complete, returning result 29946 1726882615.87399: _execute() done 29946 1726882615.87402: dumping result to json 29946 1726882615.87487: done dumping result, returning 29946 1726882615.87494: done running TaskExecutor() for managed_node2/TASK: Assert that the interface is absent - 'ethtest0' [12673a56-9f93-95e7-9dfb-0000000006c5] 29946 1726882615.87497: sending task result for task 12673a56-9f93-95e7-9dfb-0000000006c5 29946 1726882615.87562: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000006c5 29946 1726882615.87566: WORKER PROCESS EXITING ok: [managed_node2] => { "changed": false } MSG: All assertions passed 29946 1726882615.87620: no more pending results, returning what we have 29946 1726882615.87624: results queue empty 29946 1726882615.87626: checking for any_errors_fatal 29946 1726882615.87635: done checking for any_errors_fatal 29946 1726882615.87636: checking for max_fail_percentage 29946 1726882615.87638: done checking for max_fail_percentage 29946 1726882615.87639: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.87640: done checking to see if all hosts have failed 29946 1726882615.87641: getting the remaining hosts for this loop 29946 1726882615.87642: done getting the remaining hosts for this loop 29946 1726882615.87646: getting the next task for host managed_node2 29946 1726882615.87655: done getting next task for host managed_node2 29946 1726882615.87658: ^ task is: TASK: Verify network state restored to default 29946 1726882615.87661: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.87665: getting variables 29946 1726882615.87667: in VariableManager get_vars() 29946 1726882615.87701: Calling all_inventory to load vars for managed_node2 29946 1726882615.87704: Calling groups_inventory to load vars for managed_node2 29946 1726882615.87708: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.87720: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.87723: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.87726: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.89617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.91284: done with get_vars() 29946 1726882615.91310: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:236 Friday 20 September 2024 21:36:55 -0400 (0:00:00.064) 0:00:42.023 ****** 29946 1726882615.91406: entering _queue_task() for managed_node2/include_tasks 29946 1726882615.91848: worker is 1 (out of 1 available) 29946 1726882615.91859: exiting _queue_task() for managed_node2/include_tasks 29946 1726882615.91868: done queuing things up, now waiting for results queue to drain 29946 1726882615.91869: waiting for pending results... 29946 1726882615.92173: running TaskExecutor() for managed_node2/TASK: Verify network state restored to default 29946 1726882615.92178: in run() - task 12673a56-9f93-95e7-9dfb-0000000000ab 29946 1726882615.92181: variable 'ansible_search_path' from source: unknown 29946 1726882615.92219: calling self._execute() 29946 1726882615.92331: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882615.92343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882615.92380: variable 'omit' from source: magic vars 29946 1726882615.92744: variable 'ansible_distribution_major_version' from source: facts 29946 1726882615.92763: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882615.92819: _execute() done 29946 1726882615.92823: dumping result to json 29946 1726882615.92825: done dumping result, returning 29946 1726882615.92827: done running TaskExecutor() for managed_node2/TASK: Verify network state restored to default [12673a56-9f93-95e7-9dfb-0000000000ab] 29946 1726882615.92829: sending task result for task 12673a56-9f93-95e7-9dfb-0000000000ab 29946 1726882615.92947: no more pending results, returning what we have 29946 1726882615.92953: in VariableManager get_vars() 29946 1726882615.92987: Calling all_inventory to load vars for managed_node2 29946 1726882615.92992: Calling groups_inventory to load vars for managed_node2 29946 1726882615.92998: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.93012: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.93016: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.93019: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.93709: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000000ab 29946 1726882615.93712: WORKER PROCESS EXITING 29946 1726882615.94662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882615.96353: done with get_vars() 29946 
1726882615.96379: variable 'ansible_search_path' from source: unknown 29946 1726882615.96398: we have included files to process 29946 1726882615.96403: generating all_blocks data 29946 1726882615.96405: done generating all_blocks data 29946 1726882615.96409: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 29946 1726882615.96410: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 29946 1726882615.96413: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 29946 1726882615.96830: done processing included file 29946 1726882615.96832: iterating over new_blocks loaded from include file 29946 1726882615.96839: in VariableManager get_vars() 29946 1726882615.96853: done with get_vars() 29946 1726882615.96854: filtering new block on tags 29946 1726882615.96873: done filtering new block on tags 29946 1726882615.96875: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node2 29946 1726882615.96881: extending task lists for all hosts with included blocks 29946 1726882615.97146: done extending task lists 29946 1726882615.97148: done processing included files 29946 1726882615.97148: results queue empty 29946 1726882615.97149: checking for any_errors_fatal 29946 1726882615.97152: done checking for any_errors_fatal 29946 1726882615.97153: checking for max_fail_percentage 29946 1726882615.97154: done checking for max_fail_percentage 29946 1726882615.97155: checking to see if all hosts have failed and the running result is not ok 29946 1726882615.97156: done checking to see if all hosts have failed 29946 1726882615.97157: getting the remaining hosts for this loop 29946 1726882615.97158: done getting the remaining hosts for this loop 29946 1726882615.97166: getting the next task for host managed_node2 29946 1726882615.97171: done getting next task for host managed_node2 29946 1726882615.97173: ^ task is: TASK: Check routes and DNS 29946 1726882615.97176: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882615.97178: getting variables 29946 1726882615.97179: in VariableManager get_vars() 29946 1726882615.97190: Calling all_inventory to load vars for managed_node2 29946 1726882615.97194: Calling groups_inventory to load vars for managed_node2 29946 1726882615.97197: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882615.97203: Calling all_plugins_play to load vars for managed_node2 29946 1726882615.97205: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882615.97208: Calling groups_plugins_play to load vars for managed_node2 29946 1726882615.98548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882616.00063: done with get_vars() 29946 1726882616.00097: done getting variables 29946 1726882616.00143: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:36:56 -0400 (0:00:00.087) 0:00:42.111 ****** 29946 1726882616.00175: entering _queue_task() for managed_node2/shell 29946 1726882616.00718: worker is 1 (out of 1 available) 29946 1726882616.00728: exiting _queue_task() for managed_node2/shell 29946 1726882616.00739: done queuing things up, now waiting for results queue to drain 29946 1726882616.00740: waiting for pending results... 29946 1726882616.00891: running TaskExecutor() for managed_node2/TASK: Check routes and DNS 29946 1726882616.01051: in run() - task 12673a56-9f93-95e7-9dfb-0000000006f6 29946 1726882616.01056: variable 'ansible_search_path' from source: unknown 29946 1726882616.01059: variable 'ansible_search_path' from source: unknown 29946 1726882616.01061: calling self._execute() 29946 1726882616.01300: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882616.01305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882616.01307: variable 'omit' from source: magic vars 29946 1726882616.01770: variable 'ansible_distribution_major_version' from source: facts 29946 1726882616.01809: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882616.01813: variable 'omit' from source: magic vars 29946 1726882616.01840: variable 'omit' from source: magic vars 29946 1726882616.01918: variable 'omit' from source: magic vars 29946 1726882616.01921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882616.01967: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882616.01988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882616.02134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882616.02138: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882616.02141: variable 'inventory_hostname' from source: host vars for 'managed_node2' 
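check_network_dns.yml:6 dispatches through the shell action plugin (the log loads 'shell' and then the underlying 'command' action before opening the connection). The actual command is not visible in this excerpt, so the following is only a guess at the general shape, assuming the task dumps routing and resolver state for the test log without changing anything:

    # every command below is an assumption; the log confirms only the task name,
    # file, line number, and that it runs via the shell action
    - name: Check routes and DNS
      shell: |
        set -euo pipefail
        ip route
        ip -6 route
        cat /etc/resolv.conf
      changed_when: false

The _low_level_execute_command sequence that follows (echo ~, remote temp dir creation, module transfer, execution) is the standard dispatch path already shown verbatim for the stat task above.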
29946 1726882616.02143: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882616.02145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882616.02187: Set connection var ansible_pipelining to False 29946 1726882616.02195: Set connection var ansible_shell_executable to /bin/sh 29946 1726882616.02198: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882616.02204: Set connection var ansible_timeout to 10 29946 1726882616.02212: Set connection var ansible_shell_type to sh 29946 1726882616.02215: Set connection var ansible_connection to ssh 29946 1726882616.02245: variable 'ansible_shell_executable' from source: unknown 29946 1726882616.02248: variable 'ansible_connection' from source: unknown 29946 1726882616.02251: variable 'ansible_module_compression' from source: unknown 29946 1726882616.02253: variable 'ansible_shell_type' from source: unknown 29946 1726882616.02256: variable 'ansible_shell_executable' from source: unknown 29946 1726882616.02258: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882616.02260: variable 'ansible_pipelining' from source: unknown 29946 1726882616.02263: variable 'ansible_timeout' from source: unknown 29946 1726882616.02273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882616.02570: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882616.02574: variable 'omit' from source: magic vars 29946 1726882616.02576: starting attempt loop 29946 1726882616.02579: running the handler 29946 1726882616.02581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882616.02583: _low_level_execute_command(): starting 29946 1726882616.02585: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882616.03249: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882616.03265: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.03277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.03294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.03334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.03397: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882616.03411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.03441: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.03598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.05286: stdout chunk (state=3): >>>/root <<< 29946 1726882616.05360: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.05365: stdout chunk (state=3): >>><<< 29946 1726882616.05375: stderr chunk (state=3): >>><<< 29946 1726882616.05400: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.05412: _low_level_execute_command(): starting 29946 1726882616.05452: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459 `" && echo ansible-tmp-1726882616.0539787-31954-173570229779459="` echo /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459 `" ) && sleep 0' 29946 1726882616.06077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882616.06098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.06176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.06236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/6f323b04b0' <<< 29946 1726882616.06251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.06278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.06382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.08243: stdout chunk (state=3): >>>ansible-tmp-1726882616.0539787-31954-173570229779459=/root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459 <<< 29946 1726882616.08488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.08496: stdout chunk (state=3): >>><<< 29946 1726882616.08498: stderr chunk (state=3): >>><<< 29946 1726882616.08501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882616.0539787-31954-173570229779459=/root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.08503: variable 'ansible_module_compression' from source: unknown 29946 1726882616.08530: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882616.08575: variable 'ansible_facts' from source: unknown 29946 1726882616.08681: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py 29946 1726882616.08860: Sending initial data 29946 1726882616.08864: Sent initial data (156 bytes) 29946 1726882616.09508: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.09520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882616.09532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.09541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.09633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.11337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882616.11390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882616.11449: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpv8hfnlam /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py <<< 29946 1726882616.11453: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py" <<< 29946 1726882616.11536: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpv8hfnlam" to remote "/root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py" <<< 29946 1726882616.12377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.12502: stderr chunk (state=3): >>><<< 29946 1726882616.12505: stdout chunk (state=3): >>><<< 29946 1726882616.12508: done transferring module to remote 29946 1726882616.12522: _low_level_execute_command(): starting 29946 1726882616.12531: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/ /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py && sleep 0' 29946 1726882616.13219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882616.13279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882616.13305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.13417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882616.13421: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.13454: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.13527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.15312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.15315: stdout chunk (state=3): >>><<< 29946 1726882616.15318: stderr chunk (state=3): >>><<< 29946 1726882616.15338: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.15427: _low_level_execute_command(): starting 29946 1726882616.15431: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/AnsiballZ_command.py && sleep 0' 29946 1726882616.15954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882616.15966: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.15976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.15991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.16010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882616.16045: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.16057: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.16130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.16156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.16264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.31896: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3055sec preferred_lft 3055sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n24: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:57:f6:54:9a:30 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:56.309644", "end": "2024-09-20 21:36:56.317839", "delta": "0:00:00.008195", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882616.33238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.33252: stderr chunk (state=3): >>>Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882616.33311: stderr chunk (state=3): >>><<< 29946 1726882616.33330: stdout chunk (state=3): >>><<< 29946 1726882616.33361: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3055sec preferred_lft 3055sec\n inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n24: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether 6e:57:f6:54:9a:30 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:36:56.309644", "end": "2024-09-20 21:36:56.317839", "delta": "0:00:00.008195", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882616.33419: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882616.33508: _low_level_execute_command(): starting 29946 1726882616.33511: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882616.0539787-31954-173570229779459/ > /dev/null 2>&1 && sleep 0' 29946 1726882616.34026: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.34030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.34064: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.34067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.34070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.34148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.34151: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.34207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.36017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.36021: stdout chunk (state=3): >>><<< 29946 1726882616.36023: stderr chunk (state=3): >>><<< 29946 1726882616.36039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.36099: handler run complete 29946 1726882616.36103: Evaluated conditional (False): False 29946 1726882616.36105: attempt loop complete, returning result 29946 1726882616.36107: _execute() done 29946 1726882616.36110: dumping result to json 29946 1726882616.36114: done dumping result, returning 29946 1726882616.36128: done running TaskExecutor() for managed_node2/TASK: Check routes and DNS [12673a56-9f93-95e7-9dfb-0000000006f6] 29946 1726882616.36137: sending task result for task 12673a56-9f93-95e7-9dfb-0000000006f6 ok: [managed_node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008195", "end": "2024-09-20 21:36:56.317839", "rc": 0, "start": "2024-09-20 21:36:56.309644" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:c1:46:63:3b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.69/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3055sec preferred_lft 3055sec inet6 fe80::8ff:c1ff:fe46:633b/64 scope link noprefixroute valid_lft forever preferred_lft forever 24: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether 6e:57:f6:54:9a:30 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.69 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.69 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 29946 1726882616.36332: no more pending results, returning what we have 29946 1726882616.36335: results queue empty 29946 1726882616.36336: checking for any_errors_fatal 29946 1726882616.36338: done checking for any_errors_fatal 29946 1726882616.36338: checking for max_fail_percentage 29946 1726882616.36340: done checking for max_fail_percentage 29946 1726882616.36340: checking to see if all hosts have failed and the running result is not ok 29946 1726882616.36341: done checking to see if all hosts have failed 29946 
1726882616.36342: getting the remaining hosts for this loop 29946 1726882616.36343: done getting the remaining hosts for this loop 29946 1726882616.36347: getting the next task for host managed_node2 29946 1726882616.36352: done getting next task for host managed_node2 29946 1726882616.36355: ^ task is: TASK: Verify DNS and network connectivity 29946 1726882616.36357: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882616.36365: getting variables 29946 1726882616.36366: in VariableManager get_vars() 29946 1726882616.36397: Calling all_inventory to load vars for managed_node2 29946 1726882616.36400: Calling groups_inventory to load vars for managed_node2 29946 1726882616.36403: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882616.36610: Calling all_plugins_play to load vars for managed_node2 29946 1726882616.36614: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882616.36619: Calling groups_plugins_play to load vars for managed_node2 29946 1726882616.37141: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000006f6 29946 1726882616.37145: WORKER PROCESS EXITING 29946 1726882616.38246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882616.39251: done with get_vars() 29946 1726882616.39267: done getting variables 29946 1726882616.39315: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:36:56 -0400 (0:00:00.391) 0:00:42.502 ****** 29946 1726882616.39337: entering _queue_task() for managed_node2/shell 29946 1726882616.39574: worker is 1 (out of 1 available) 29946 1726882616.39587: exiting _queue_task() for managed_node2/shell 29946 1726882616.39604: done queuing things up, now waiting for results queue to drain 29946 1726882616.39605: waiting for pending results... 
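The module_args logged further below show the shell payload this task runs: a getent lookup plus an HTTPS curl against each mirror host. As a tasks-file entry it would look roughly like the sketch below; the YAML is reconstructed from the log, not copied from check_network_dns.yml, the changed_when line is an assumption (by analogy with the previous task), and the when conditions mirror the conditional evaluations logged below for this task.

- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then
        echo FAILED to lookup host "$host"
        exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then
        echo FAILED to contact host "$host"
        exit 1
      fi
    done
  changed_when: false                                 # assumption
  when:
    - ansible_distribution_major_version != '6'       # evaluated below for this task
    - ansible_facts["distribution"] == "CentOS"       # evaluated below for this task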
29946 1726882616.39816: running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity 29946 1726882616.39924: in run() - task 12673a56-9f93-95e7-9dfb-0000000006f7 29946 1726882616.39929: variable 'ansible_search_path' from source: unknown 29946 1726882616.39932: variable 'ansible_search_path' from source: unknown 29946 1726882616.39978: calling self._execute() 29946 1726882616.40202: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882616.40206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882616.40209: variable 'omit' from source: magic vars 29946 1726882616.40460: variable 'ansible_distribution_major_version' from source: facts 29946 1726882616.40472: Evaluated conditional (ansible_distribution_major_version != '6'): True 29946 1726882616.40612: variable 'ansible_facts' from source: unknown 29946 1726882616.41548: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 29946 1726882616.41559: variable 'omit' from source: magic vars 29946 1726882616.41604: variable 'omit' from source: magic vars 29946 1726882616.41654: variable 'omit' from source: magic vars 29946 1726882616.41700: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 29946 1726882616.41847: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 29946 1726882616.41851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 29946 1726882616.41853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882616.41856: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 29946 1726882616.41889: variable 'inventory_hostname' from source: host vars for 'managed_node2' 29946 1726882616.41902: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882616.41913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882616.42082: Set connection var ansible_pipelining to False 29946 1726882616.42088: Set connection var ansible_shell_executable to /bin/sh 29946 1726882616.42102: Set connection var ansible_module_compression to ZIP_DEFLATED 29946 1726882616.42111: Set connection var ansible_timeout to 10 29946 1726882616.42174: Set connection var ansible_shell_type to sh 29946 1726882616.42177: Set connection var ansible_connection to ssh 29946 1726882616.42182: variable 'ansible_shell_executable' from source: unknown 29946 1726882616.42186: variable 'ansible_connection' from source: unknown 29946 1726882616.42188: variable 'ansible_module_compression' from source: unknown 29946 1726882616.42190: variable 'ansible_shell_type' from source: unknown 29946 1726882616.42192: variable 'ansible_shell_executable' from source: unknown 29946 1726882616.42196: variable 'ansible_host' from source: host vars for 'managed_node2' 29946 1726882616.42197: variable 'ansible_pipelining' from source: unknown 29946 1726882616.42199: variable 'ansible_timeout' from source: unknown 29946 1726882616.42201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node2' 29946 1726882616.42604: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882616.42611: variable 'omit' from source: magic vars 29946 1726882616.42614: starting attempt loop 29946 1726882616.42616: running the handler 29946 1726882616.42619: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 29946 1726882616.42628: _low_level_execute_command(): starting 29946 1726882616.42631: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 29946 1726882616.43326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.43330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.43332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.43335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.43382: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.43388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.43453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.45416: stdout chunk (state=3): >>>/root <<< 29946 1726882616.45420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.45422: stdout chunk (state=3): >>><<< 29946 1726882616.45424: stderr chunk (state=3): >>><<< 29946 1726882616.45427: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.45466: _low_level_execute_command(): starting 29946 1726882616.45472: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175 `" && echo ansible-tmp-1726882616.4533622-31978-257517825777175="` echo /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175 `" ) && sleep 0' 29946 1726882616.45970: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882616.45998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.46017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.46035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.46053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882616.46064: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882616.46078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.46101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882616.46134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.46199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882616.46210: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.46225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.46286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.48152: stdout chunk (state=3): >>>ansible-tmp-1726882616.4533622-31978-257517825777175=/root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175 <<< 29946 1726882616.48297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.48300: stdout chunk (state=3): >>><<< 29946 1726882616.48304: stderr chunk (state=3): >>><<< 29946 1726882616.48501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882616.4533622-31978-257517825777175=/root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.48504: variable 'ansible_module_compression' from source: unknown 29946 1726882616.48507: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-29946kfugda57/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 29946 1726882616.48509: variable 'ansible_facts' from source: unknown 29946 1726882616.48556: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py 29946 1726882616.48659: Sending initial data 29946 1726882616.48669: Sent initial data (156 bytes) 29946 1726882616.49058: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.49064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.49069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 <<< 29946 1726882616.49075: stderr chunk (state=3): >>>debug2: match not found <<< 29946 1726882616.49087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.49095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 29946 1726882616.49115: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.49118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.49169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.49172: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.49239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.50754: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 29946 1726882616.50758: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 29946 1726882616.50815: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 29946 1726882616.50877: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-29946kfugda57/tmpsvzupvlz /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py <<< 29946 1726882616.50880: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py" <<< 29946 1726882616.50933: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-29946kfugda57/tmpsvzupvlz" to remote "/root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py" <<< 29946 1726882616.51542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.51575: stderr chunk (state=3): >>><<< 29946 1726882616.51578: stdout chunk (state=3): >>><<< 29946 1726882616.51596: done transferring module to remote 29946 1726882616.51603: _low_level_execute_command(): starting 29946 1726882616.51606: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/ /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py && sleep 0' 29946 1726882616.52013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.52018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.52020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.52022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.52024: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.52065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' <<< 29946 1726882616.52068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.52132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.53849: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 29946 1726882616.53868: stderr chunk (state=3): >>><<< 29946 1726882616.53871: stdout chunk (state=3): >>><<< 29946 1726882616.53882: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.53885: _low_level_execute_command(): starting 29946 1726882616.53891: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/AnsiballZ_command.py && sleep 0' 29946 1726882616.54258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.54296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.54299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.54301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration <<< 29946 1726882616.54304: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found <<< 29946 1726882616.54305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 29946 1726882616.54341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.54353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.54424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.75642: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13565 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 14934 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:56.692524", "end": "2024-09-20 21:36:56.755415", "delta": "0:00:00.062891", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 29946 1726882616.77276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 
<<< 29946 1726882616.77281: stdout chunk (state=3): >>><<< 29946 1726882616.77292: stderr chunk (state=3): >>><<< 29946 1726882616.77298: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 13565 0 --:--:-- --:--:-- --:--:-- 13863\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 14934 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:36:56.692524", "end": "2024-09-20 21:36:56.755415", "delta": "0:00:00.062891", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.69 closed. 29946 1726882616.77301: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 29946 1726882616.77304: _low_level_execute_command(): starting 29946 1726882616.77306: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882616.4533622-31978-257517825777175/ > /dev/null 2>&1 && sleep 0' 29946 1726882616.77873: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 29946 1726882616.77888: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 29946 1726882616.77909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 29946 1726882616.77938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 29946 1726882616.78049: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK <<< 29946 1726882616.78069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 29946 1726882616.78169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 29946 1726882616.80002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 29946 1726882616.80006: stdout chunk (state=3): >>><<< 29946 1726882616.80009: stderr chunk (state=3): >>><<< 29946 1726882616.80030: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.69 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.69 originally 10.31.14.69 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/6f323b04b0' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 29946 1726882616.80053: handler run complete 29946 1726882616.80079: Evaluated conditional (False): False 29946 1726882616.80097: attempt loop complete, returning result 29946 1726882616.80105: _execute() done 29946 1726882616.80111: dumping result to json 29946 1726882616.80149: done dumping result, returning 29946 1726882616.80152: done running TaskExecutor() for managed_node2/TASK: Verify DNS and network connectivity [12673a56-9f93-95e7-9dfb-0000000006f7] 29946 1726882616.80155: sending task result for task 12673a56-9f93-95e7-9dfb-0000000006f7
ok: [managed_node2] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.062891",
    "end": "2024-09-20 21:36:56.755415",
    "rc": 0,
    "start": "2024-09-20 21:36:56.692524"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 305 100 305 0 0 13565 0 --:--:-- --:--:-- --:--:-- 13863
 % Total % Received % Xferd Average Speed Time Time Time Current
 Dload Upload Total Spent Left Speed
 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0
100 291 100 291 0 0 14934 0 --:--:-- --:--:-- --:--:-- 15315

29946 1726882616.80467: no more pending results, returning what we have 29946 1726882616.80471: results queue empty 29946 1726882616.80471: checking for any_errors_fatal 29946 1726882616.80481: done checking for any_errors_fatal 29946 1726882616.80482: checking for max_fail_percentage 29946 1726882616.80484: done checking for max_fail_percentage 29946 1726882616.80485: checking to see if all hosts have
failed and the running result is not ok 29946 1726882616.80486: done checking to see if all hosts have failed 29946 1726882616.80487: getting the remaining hosts for this loop 29946 1726882616.80488: done getting the remaining hosts for this loop 29946 1726882616.80497: getting the next task for host managed_node2 29946 1726882616.80506: done getting next task for host managed_node2 29946 1726882616.80509: ^ task is: TASK: meta (flush_handlers) 29946 1726882616.80512: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882616.80517: getting variables 29946 1726882616.80518: in VariableManager get_vars() 29946 1726882616.80551: Calling all_inventory to load vars for managed_node2 29946 1726882616.80554: Calling groups_inventory to load vars for managed_node2 29946 1726882616.80558: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882616.80569: Calling all_plugins_play to load vars for managed_node2 29946 1726882616.80572: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882616.80575: Calling groups_plugins_play to load vars for managed_node2 29946 1726882616.81185: done sending task result for task 12673a56-9f93-95e7-9dfb-0000000006f7 29946 1726882616.81191: WORKER PROCESS EXITING 29946 1726882616.81848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882616.82728: done with get_vars() 29946 1726882616.82745: done getting variables 29946 1726882616.82799: in VariableManager get_vars() 29946 1726882616.82806: Calling all_inventory to load vars for managed_node2 29946 1726882616.82808: Calling groups_inventory to load vars for managed_node2 29946 1726882616.82809: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882616.82812: Calling all_plugins_play to load vars for managed_node2 29946 1726882616.82814: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882616.82815: Calling groups_plugins_play to load vars for managed_node2 29946 1726882616.87316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882616.88339: done with get_vars() 29946 1726882616.88360: done queuing things up, now waiting for results queue to drain 29946 1726882616.88361: results queue empty 29946 1726882616.88362: checking for any_errors_fatal 29946 1726882616.88364: done checking for any_errors_fatal 29946 1726882616.88365: checking for max_fail_percentage 29946 1726882616.88365: done checking for max_fail_percentage 29946 1726882616.88366: checking to see if all hosts have failed and the running result is not ok 29946 1726882616.88366: done checking to see if all hosts have failed 29946 1726882616.88367: getting the remaining hosts for this loop 29946 1726882616.88367: done getting the remaining hosts for this loop 29946 1726882616.88369: getting the next task for host managed_node2 29946 1726882616.88372: done getting next task for host managed_node2 29946 1726882616.88373: ^ task is: TASK: meta (flush_handlers) 29946 1726882616.88374: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 29946 1726882616.88376: getting variables 29946 1726882616.88376: in VariableManager get_vars() 29946 1726882616.88382: Calling all_inventory to load vars for managed_node2 29946 1726882616.88383: Calling groups_inventory to load vars for managed_node2 29946 1726882616.88385: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882616.88388: Calling all_plugins_play to load vars for managed_node2 29946 1726882616.88392: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882616.88396: Calling groups_plugins_play to load vars for managed_node2 29946 1726882616.89051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882616.90079: done with get_vars() 29946 1726882616.90097: done getting variables 29946 1726882616.90132: in VariableManager get_vars() 29946 1726882616.90138: Calling all_inventory to load vars for managed_node2 29946 1726882616.90139: Calling groups_inventory to load vars for managed_node2 29946 1726882616.90141: Calling all_plugins_inventory to load vars for managed_node2 29946 1726882616.90144: Calling all_plugins_play to load vars for managed_node2 29946 1726882616.90148: Calling groups_plugins_inventory to load vars for managed_node2 29946 1726882616.90150: Calling groups_plugins_play to load vars for managed_node2 29946 1726882616.90839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 29946 1726882616.92011: done with get_vars() 29946 1726882616.92034: done queuing things up, now waiting for results queue to drain 29946 1726882616.92036: results queue empty 29946 1726882616.92037: checking for any_errors_fatal 29946 1726882616.92038: done checking for any_errors_fatal 29946 1726882616.92039: checking for max_fail_percentage 29946 1726882616.92039: done checking for max_fail_percentage 29946 1726882616.92040: checking to see if all hosts have failed and the running result is not ok 29946 1726882616.92041: done checking to see if all hosts have failed 29946 1726882616.92042: getting the remaining hosts for this loop 29946 1726882616.92042: done getting the remaining hosts for this loop 29946 1726882616.92045: getting the next task for host managed_node2 29946 1726882616.92048: done getting next task for host managed_node2 29946 1726882616.92049: ^ task is: None 29946 1726882616.92050: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 29946 1726882616.92051: done queuing things up, now waiting for results queue to drain 29946 1726882616.92052: results queue empty 29946 1726882616.92053: checking for any_errors_fatal 29946 1726882616.92053: done checking for any_errors_fatal 29946 1726882616.92054: checking for max_fail_percentage 29946 1726882616.92055: done checking for max_fail_percentage 29946 1726882616.92055: checking to see if all hosts have failed and the running result is not ok 29946 1726882616.92056: done checking to see if all hosts have failed 29946 1726882616.92057: getting the next task for host managed_node2 29946 1726882616.92059: done getting next task for host managed_node2 29946 1726882616.92060: ^ task is: None 29946 1726882616.92061: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node2 : ok=87 changed=5 unreachable=0 failed=0 skipped=73 rescued=0 ignored=1

Friday 20 September 2024 21:36:56 -0400 (0:00:00.527) 0:00:43.030 ******
===============================================================================
Gathering Facts --------------------------------------------------------- 2.06s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.90s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.82s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Gathering Facts --------------------------------------------------------- 1.27s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_routing_rules_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.17s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Create veth interface ethtest0 ------------------------------------------ 1.17s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Install iproute --------------------------------------------------------- 1.10s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Gathering Facts --------------------------------------------------------- 1.06s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:3
Gathering Facts --------------------------------------------------------- 1.01s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_routing_rules.yml:227
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.99s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.95s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.95s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.89s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 0.89s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.87s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.82s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.72s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.62s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
29946 1726882616.92137: RUNNING CLEANUP
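
Note on the "Verify DNS and network connectivity" task above: the result comes from a shell script pushed through ansible.legacy.command with _uses_shell=true. A minimal playbook-level sketch that would produce an equivalent invocation is shown below. The task name and script body are taken from the log (whitespace normalized); the ansible.builtin.shell spelling and the changed_when: false guard (suggested by the final "changed": false even though the module itself reported a change) are assumptions, not the exact wording used by the fedora.linux_system_roles test suite.

  # Sketch only: reconstructs the logged invocation as a playbook task.
  - name: Verify DNS and network connectivity
    ansible.builtin.shell: |
      set -euo pipefail
      echo CHECK DNS AND CONNECTIVITY
      for host in mirrors.fedoraproject.org mirrors.centos.org; do
        if ! getent hosts "$host"; then
          echo FAILED to lookup host "$host"
          exit 1
        fi
        if ! curl -o /dev/null https://"$host"; then
          echo FAILED to contact host "$host"
          exit 1
        fi
      done
    changed_when: false   # assumed; matches the "changed": false shown in the display

Because the script runs with set -euo pipefail and exits non-zero on any lookup or curl failure, the task fails the play as soon as either mirror is unreachable, which is why a clean rc=0 here is enough to confirm DNS and outbound HTTPS from the managed node.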
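
The "TASK: meta (flush_handlers)" entries near the end are the implicit handler-flush tasks that Ansible adds on its own at the end of each play; the test playbooks do not declare them. If a play needed to force the same flush at an explicit point, a task of the following form would do it (the task name here is hypothetical):

  # Hypothetical explicit equivalent of the implicit flush seen in the log.
  - name: Flush notified handlers at this point
    ansible.builtin.meta: flush_handlers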