22690 1727204232.10985: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-MVC
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
22690 1727204232.11311: Added group all to inventory
22690 1727204232.11313: Added group ungrouped to inventory
22690 1727204232.11318: Group all now contains ungrouped
22690 1727204232.11321: Examining possible inventory source: /tmp/network-jrl/inventory-0Xx.yml
22690 1727204232.26020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
22690 1727204232.26098: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
22690 1727204232.26128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
22690 1727204232.26197: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
22690 1727204232.26316: Loaded config def from plugin (inventory/script)
22690 1727204232.26319: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
22690 1727204232.26364: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
22690 1727204232.26491: Loaded config def from plugin (inventory/yaml)
22690 1727204232.26493: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
22690 1727204232.26602: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
22690 1727204232.27150: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
22690 1727204232.27156: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
22690 1727204232.27160: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
22690 1727204232.27177: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
22690 1727204232.27185: Loading data from /tmp/network-jrl/inventory-0Xx.yml
22690 1727204232.27269: /tmp/network-jrl/inventory-0Xx.yml was not parsable by auto
22690 1727204232.27348: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
22690 1727204232.27398: Loading data from /tmp/network-jrl/inventory-0Xx.yml
22690 1727204232.27499: group all already in inventory
22690 1727204232.27507: set inventory_file for managed-node1
22690 1727204232.27512: set inventory_dir for managed-node1
22690 1727204232.27513: Added host managed-node1 to inventory
22690 1727204232.27515: Added host managed-node1 to group all
22690 1727204232.27516: set ansible_host for managed-node1
22690 1727204232.27517: set ansible_ssh_extra_args for managed-node1
22690 1727204232.27521: set inventory_file for managed-node2
22690 1727204232.27524: set inventory_dir for managed-node2
22690 1727204232.27525: Added host managed-node2 to inventory
22690 1727204232.27527: Added host managed-node2 to group all
22690 1727204232.27528: set ansible_host for managed-node2
22690 1727204232.27528: set ansible_ssh_extra_args for managed-node2
22690 1727204232.27531: set inventory_file for managed-node3
22690 1727204232.27534: set inventory_dir for managed-node3
22690 1727204232.27535: Added host managed-node3 to inventory
22690 1727204232.27536: Added host managed-node3 to group all
22690 1727204232.27537: set ansible_host for managed-node3
22690 1727204232.27538: set ansible_ssh_extra_args for managed-node3
22690 1727204232.27541: Reconcile groups and hosts in inventory.
22690 1727204232.27547: Group ungrouped now contains managed-node1
22690 1727204232.27549: Group ungrouped now contains managed-node2
22690 1727204232.27550: Group ungrouped now contains managed-node3
22690 1727204232.27650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
22690 1727204232.27817: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
22690 1727204232.27876: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
22690 1727204232.27915: Loaded config def from plugin (vars/host_group_vars)
22690 1727204232.27917: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
22690 1727204232.27925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
22690 1727204232.27934: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
22690 1727204232.27985: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
22690 1727204232.28404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
22690 1727204232.28517: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
22690 1727204232.28573: Loaded config def from plugin (connection/local)
22690 1727204232.28577: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
22690 1727204232.29407: Loaded config def from plugin (connection/paramiko_ssh)
22690 1727204232.29412: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
22690 1727204232.30505: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
22690 1727204232.30566: Loaded config def from plugin (connection/psrp)
22690 1727204232.30570: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
22690 1727204232.31461: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
22690 1727204232.31519: Loaded config def from plugin (connection/ssh)
22690 1727204232.31523: Loading Connection 'ssh' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 22690 1727204232.33954: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22690 1727204232.34003: Loaded config def from plugin (connection/winrm) 22690 1727204232.34007: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 22690 1727204232.34053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 22690 1727204232.34127: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 22690 1727204232.34213: Loaded config def from plugin (shell/cmd) 22690 1727204232.34216: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 22690 1727204232.34253: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 22690 1727204232.34328: Loaded config def from plugin (shell/powershell) 22690 1727204232.34331: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 22690 1727204232.34397: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 22690 1727204232.34615: Loaded config def from plugin (shell/sh) 22690 1727204232.34617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 22690 1727204232.34655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 22690 1727204232.34808: Loaded config def from plugin (become/runas) 22690 1727204232.34811: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 22690 1727204232.35035: Loaded config def from plugin (become/su) 22690 1727204232.35037: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 22690 1727204232.35254: Loaded config def from plugin (become/sudo) 22690 1727204232.35259: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 22690 1727204232.35311: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 22690 1727204232.35753: in VariableManager get_vars() 22690 1727204232.35780: done with get_vars() 22690 1727204232.35987: trying /usr/local/lib/python3.12/site-packages/ansible/modules 22690 1727204232.40500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 22690 1727204232.40641: in VariableManager get_vars() 22690 1727204232.40647: done with get_vars() 22690 1727204232.40650: variable 'playbook_dir' from source: magic vars 22690 1727204232.40651: variable 'ansible_playbook_python' from source: magic vars 22690 1727204232.40652: variable 'ansible_config_file' from 
source: magic vars 22690 1727204232.40652: variable 'groups' from source: magic vars 22690 1727204232.40653: variable 'omit' from source: magic vars 22690 1727204232.40654: variable 'ansible_version' from source: magic vars 22690 1727204232.40655: variable 'ansible_check_mode' from source: magic vars 22690 1727204232.40656: variable 'ansible_diff_mode' from source: magic vars 22690 1727204232.40656: variable 'ansible_forks' from source: magic vars 22690 1727204232.40657: variable 'ansible_inventory_sources' from source: magic vars 22690 1727204232.40658: variable 'ansible_skip_tags' from source: magic vars 22690 1727204232.40659: variable 'ansible_limit' from source: magic vars 22690 1727204232.40659: variable 'ansible_run_tags' from source: magic vars 22690 1727204232.40660: variable 'ansible_verbosity' from source: magic vars 22690 1727204232.40713: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 22690 1727204232.41525: in VariableManager get_vars() 22690 1727204232.41543: done with get_vars() 22690 1727204232.41601: in VariableManager get_vars() 22690 1727204232.41624: done with get_vars() 22690 1727204232.41660: in VariableManager get_vars() 22690 1727204232.41684: done with get_vars() 22690 1727204232.41718: in VariableManager get_vars() 22690 1727204232.41730: done with get_vars() 22690 1727204232.41821: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22690 1727204232.42074: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22690 1727204232.42298: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22690 1727204232.43953: in VariableManager get_vars() 22690 1727204232.44102: done with get_vars() 22690 1727204232.45124: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 22690 1727204232.45531: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22690 1727204232.48719: in VariableManager get_vars() 22690 1727204232.48742: done with get_vars() 22690 1727204232.49127: in VariableManager get_vars() 22690 1727204232.49132: done with get_vars() 22690 1727204232.49135: variable 'playbook_dir' from source: magic vars 22690 1727204232.49136: variable 'ansible_playbook_python' from source: magic vars 22690 1727204232.49137: variable 'ansible_config_file' from source: magic vars 22690 1727204232.49138: variable 'groups' from source: magic vars 22690 1727204232.49138: variable 'omit' from source: magic vars 22690 1727204232.49139: variable 'ansible_version' from source: magic vars 22690 1727204232.49140: variable 'ansible_check_mode' from source: magic vars 22690 1727204232.49141: variable 'ansible_diff_mode' from source: magic vars 22690 1727204232.49142: variable 'ansible_forks' from source: magic vars 22690 1727204232.49142: variable 'ansible_inventory_sources' from source: magic vars 22690 1727204232.49143: variable 'ansible_skip_tags' from source: magic vars 22690 1727204232.49144: variable 'ansible_limit' from source: magic vars 22690 1727204232.49145: variable 'ansible_run_tags' from source: magic vars 22690 1727204232.49146: variable 'ansible_verbosity' from source: magic vars 22690 1727204232.49348: Loading data from 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 22690 1727204232.49477: in VariableManager get_vars() 22690 1727204232.49481: done with get_vars() 22690 1727204232.49484: variable 'playbook_dir' from source: magic vars 22690 1727204232.49485: variable 'ansible_playbook_python' from source: magic vars 22690 1727204232.49485: variable 'ansible_config_file' from source: magic vars 22690 1727204232.49486: variable 'groups' from source: magic vars 22690 1727204232.49487: variable 'omit' from source: magic vars 22690 1727204232.49488: variable 'ansible_version' from source: magic vars 22690 1727204232.49489: variable 'ansible_check_mode' from source: magic vars 22690 1727204232.49490: variable 'ansible_diff_mode' from source: magic vars 22690 1727204232.49490: variable 'ansible_forks' from source: magic vars 22690 1727204232.49491: variable 'ansible_inventory_sources' from source: magic vars 22690 1727204232.49492: variable 'ansible_skip_tags' from source: magic vars 22690 1727204232.49493: variable 'ansible_limit' from source: magic vars 22690 1727204232.49493: variable 'ansible_run_tags' from source: magic vars 22690 1727204232.49494: variable 'ansible_verbosity' from source: magic vars 22690 1727204232.49535: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 22690 1727204232.49872: in VariableManager get_vars() 22690 1727204232.49890: done with get_vars() 22690 1727204232.49940: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22690 1727204232.50271: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22690 1727204232.50420: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22690 1727204232.51654: in VariableManager get_vars() 22690 1727204232.51683: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22690 1727204232.55577: in VariableManager get_vars() 22690 1727204232.55669: done with get_vars() 22690 1727204232.55829: in VariableManager get_vars() 22690 1727204232.55834: done with get_vars() 22690 1727204232.55836: variable 'playbook_dir' from source: magic vars 22690 1727204232.55837: variable 'ansible_playbook_python' from source: magic vars 22690 1727204232.55838: variable 'ansible_config_file' from source: magic vars 22690 1727204232.55839: variable 'groups' from source: magic vars 22690 1727204232.55840: variable 'omit' from source: magic vars 22690 1727204232.55841: variable 'ansible_version' from source: magic vars 22690 1727204232.55841: variable 'ansible_check_mode' from source: magic vars 22690 1727204232.55842: variable 'ansible_diff_mode' from source: magic vars 22690 1727204232.55843: variable 'ansible_forks' from source: magic vars 22690 1727204232.55844: variable 'ansible_inventory_sources' from source: magic vars 22690 1727204232.55845: variable 'ansible_skip_tags' from source: magic vars 22690 1727204232.55850: variable 'ansible_limit' from source: magic vars 22690 1727204232.55850: variable 'ansible_run_tags' from source: magic vars 22690 1727204232.55851: variable 'ansible_verbosity' from source: magic vars 22690 1727204232.55893: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 22690 1727204232.56099: in 
VariableManager get_vars() 22690 1727204232.56113: done with get_vars() 22690 1727204232.56283: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22690 1727204232.61022: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22690 1727204232.61117: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22690 1727204232.61589: in VariableManager get_vars() 22690 1727204232.61614: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22690 1727204232.64681: in VariableManager get_vars() 22690 1727204232.64698: done with get_vars() 22690 1727204232.64777: in VariableManager get_vars() 22690 1727204232.64791: done with get_vars() 22690 1727204232.64976: in VariableManager get_vars() 22690 1727204232.64990: done with get_vars() 22690 1727204232.65214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 22690 1727204232.65230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 22690 1727204232.65849: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 22690 1727204232.66294: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 22690 1727204232.66298: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 22690 1727204232.66334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 22690 1727204232.66364: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 22690 1727204232.66699: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 22690 1727204232.66891: Loaded config def from plugin (callback/default) 22690 1727204232.66895: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 22690 1727204232.70918: Loaded config def from plugin (callback/junit) 22690 1727204232.70922: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 22690 1727204232.71194: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 22690 1727204232.71479: Loaded config def from plugin 
(callback/minimal)
22690 1727204232.71484: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
22690 1727204232.71539: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
22690 1727204232.71609: Loaded config def from plugin (callback/tree)
22690 1727204232.71612: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
22690 1727204232.72158: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
22690 1727204232.72161: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-MVC/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
22690 1727204232.72395: in VariableManager get_vars()
22690 1727204232.72413: done with get_vars()
22690 1727204232.72419: in VariableManager get_vars()
22690 1727204232.72429: done with get_vars()
22690 1727204232.72433: variable 'omit' from source: magic vars
22690 1727204232.72683: in VariableManager get_vars()
22690 1727204232.72700: done with get_vars()
22690 1727204232.72724: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
22690 1727204232.74609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
22690 1727204232.74710: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
22690 1727204232.74977: getting the remaining hosts for this loop
22690 1727204232.74979: done getting the remaining hosts for this loop
22690 1727204232.74983: getting the next task for host managed-node2
22690 1727204232.74988: done getting next task for host managed-node2
22690 1727204232.74990: ^ task is: TASK: Gathering Facts
22690 1727204232.74992: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
22690 1727204232.75000: getting variables
22690 1727204232.75001: in VariableManager get_vars()
22690 1727204232.75022: Calling all_inventory to load vars for managed-node2
22690 1727204232.75025: Calling groups_inventory to load vars for managed-node2
22690 1727204232.75028: Calling all_plugins_inventory to load vars for managed-node2
22690 1727204232.75043: Calling all_plugins_play to load vars for managed-node2
22690 1727204232.75056: Calling groups_plugins_inventory to load vars for managed-node2
22690 1727204232.75059: Calling groups_plugins_play to load vars for managed-node2
22690 1727204232.75102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
22690 1727204232.75364: done with get_vars()
22690 1727204232.75377: done getting variables
22690 1727204232.75575: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Tuesday 24 September 2024  14:57:12 -0400 (0:00:00.039)       0:00:00.039 *****
22690 1727204232.75604: entering _queue_task() for managed-node2/gather_facts
22690 1727204232.75606: Creating lock for gather_facts
22690 1727204232.76356: worker is 1 (out of 1 available)
22690 1727204232.76772: exiting _queue_task() for managed-node2/gather_facts
22690 1727204232.76785: done queuing things up, now waiting for results queue to drain
22690 1727204232.76787: waiting for pending results...
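
The ansible_host and ansible_ssh_extra_args values that the task executor resolves below were set while parsing /tmp/network-jrl/inventory-0Xx.yml at the top of this log. The inventory file itself is not captured in the log, so the following is only a hypothetical minimal YAML inventory with the same shape: the host names and variable names come from the log, managed-node2's address is taken from the ssh debug output further down, and the remaining addresses plus the ssh option are placeholders.

all:
  hosts:
    managed-node1:
      ansible_host: 192.0.2.101                            # placeholder address, not in the log
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no  # placeholder option, not in the log
    managed-node2:
      ansible_host: 10.31.47.73                            # address seen in the ssh debug output below
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no  # placeholder option, not in the log
    managed-node3:
      ansible_host: 192.0.2.103                            # placeholder address, not in the log
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no  # placeholder option, not in the log

All three hosts end up in the built-in groups all and ungrouped, which matches the "Reconcile groups and hosts" records above; no custom groups appear in this run.
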
22690 1727204232.77105: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204232.77163: in run() - task 127b8e07-fff9-78bb-bf56-00000000007c 22690 1727204232.77251: variable 'ansible_search_path' from source: unknown 22690 1727204232.77293: calling self._execute() 22690 1727204232.77480: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204232.77487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204232.77498: variable 'omit' from source: magic vars 22690 1727204232.77801: variable 'omit' from source: magic vars 22690 1727204232.77805: variable 'omit' from source: magic vars 22690 1727204232.77909: variable 'omit' from source: magic vars 22690 1727204232.77959: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204232.78104: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204232.78280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204232.78284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204232.78287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204232.78290: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204232.78292: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204232.78294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204232.78506: Set connection var ansible_connection to ssh 22690 1727204232.78670: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204232.78674: Set connection var ansible_pipelining to False 22690 1727204232.78677: Set connection var ansible_shell_type to sh 22690 1727204232.78679: Set connection var ansible_shell_executable to /bin/sh 22690 1727204232.78682: Set connection var ansible_timeout to 10 22690 1727204232.78695: variable 'ansible_shell_executable' from source: unknown 22690 1727204232.78699: variable 'ansible_connection' from source: unknown 22690 1727204232.78701: variable 'ansible_module_compression' from source: unknown 22690 1727204232.78704: variable 'ansible_shell_type' from source: unknown 22690 1727204232.78706: variable 'ansible_shell_executable' from source: unknown 22690 1727204232.78709: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204232.78711: variable 'ansible_pipelining' from source: unknown 22690 1727204232.78714: variable 'ansible_timeout' from source: unknown 22690 1727204232.78719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204232.79155: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204232.79177: variable 'omit' from source: magic vars 22690 1727204232.79278: starting attempt loop 22690 1727204232.79282: running the handler 22690 1727204232.79300: variable 'ansible_facts' from source: unknown 22690 1727204232.79330: _low_level_execute_command(): starting 22690 1727204232.79338: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204232.80975: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204232.81183: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204232.81225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204232.81286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204232.81400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204232.83195: stdout chunk (state=3): >>>/root <<< 22690 1727204232.83307: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204232.83673: stderr chunk (state=3): >>><<< 22690 1727204232.83677: stdout chunk (state=3): >>><<< 22690 1727204232.83681: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204232.83684: _low_level_execute_command(): starting 22690 1727204232.83688: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235 `" && echo ansible-tmp-1727204232.8365285-22782-52804327140235="` echo /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235 `" ) && sleep 0' 22690 1727204232.84924: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204232.84938: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204232.84954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204232.84978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204232.84995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204232.85006: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204232.85020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204232.85039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204232.85064: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204232.85080: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204232.85092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204232.85105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204232.85120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204232.85180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204232.85218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204232.85248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204232.85261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204232.85376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204232.87386: stdout chunk (state=3): >>>ansible-tmp-1727204232.8365285-22782-52804327140235=/root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235 <<< 22690 1727204232.87628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204232.87651: stderr chunk (state=3): >>><<< 22690 1727204232.87662: stdout chunk (state=3): >>><<< 22690 1727204232.87690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204232.8365285-22782-52804327140235=/root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204232.87954: variable 'ansible_module_compression' from source: unknown 22690 1727204232.87957: ANSIBALLZ: Using generic lock for ansible.legacy.setup 22690 1727204232.87959: ANSIBALLZ: Acquiring lock 22690 1727204232.87962: ANSIBALLZ: Lock acquired: 139846653776800 22690 1727204232.87964: ANSIBALLZ: Creating module 22690 1727204233.59807: ANSIBALLZ: Writing module into payload 22690 1727204233.60187: ANSIBALLZ: Writing module 22690 1727204233.60233: ANSIBALLZ: Renaming module 22690 1727204233.60435: ANSIBALLZ: Done creating module 22690 1727204233.60438: variable 'ansible_facts' from source: unknown 22690 1727204233.60441: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204233.60444: _low_level_execute_command(): starting 22690 1727204233.60446: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 22690 1727204233.61769: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204233.61961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204233.62188: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204233.62415: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204233.64201: stdout chunk (state=3): >>>PLATFORM <<< 22690 1727204233.64267: stdout chunk (state=3): >>>Linux <<< 22690 1727204233.64294: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 <<< 22690 1727204233.64315: stdout chunk (state=3): >>>ENDFOUND <<< 22690 1727204233.64541: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204233.64545: stdout chunk (state=3): >>><<< 22690 1727204233.64547: stderr chunk (state=3): >>><<< 22690 1727204233.64770: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204233.64778 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 22690 1727204233.64782: _low_level_execute_command(): starting 22690 1727204233.64785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 22690 1727204233.65127: Sending initial data 22690 1727204233.65131: Sent initial data (1181 bytes) 22690 1727204233.66189: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204233.66348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204233.66479: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204233.66542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204233.70241: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 
(Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} <<< 22690 1727204233.70669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204233.71079: stderr chunk (state=3): >>><<< 22690 1727204233.71083: stdout chunk (state=3): >>><<< 22690 1727204233.71087: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"40 (Forty)\"\nID=fedora\nVERSION_ID=40\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f40\"\nPRETTY_NAME=\"Fedora Linux 40 (Forty)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:40\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f40/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=40\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=40\nSUPPORT_END=2025-05-13\n"} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204233.71090: variable 'ansible_facts' from source: unknown 22690 1727204233.71096: variable 'ansible_facts' from source: unknown 22690 1727204233.71112: variable 'ansible_module_compression' from source: unknown 22690 1727204233.71159: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204233.71317: variable 'ansible_facts' from source: unknown 22690 1727204233.71524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py 22690 1727204233.71972: Sending initial data 22690 1727204233.71983: Sent initial data (153 bytes) 22690 1727204233.73764: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204233.73786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204233.74251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204233.74264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204233.74313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204233.74686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204233.76143: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204233.76212: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204233.76281: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpk_ncu9j3 /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py <<< 22690 1727204233.76290: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py" <<< 22690 1727204233.76341: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpk_ncu9j3" to remote "/root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py" <<< 22690 1727204233.80227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204233.80231: stderr chunk (state=3): >>><<< 22690 1727204233.80234: stdout chunk (state=3): >>><<< 22690 1727204233.80236: done transferring module to remote 22690 1727204233.80238: _low_level_execute_command(): starting 22690 1727204233.80241: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/ /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py && sleep 0' 22690 1727204233.81610: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204233.81648: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204233.81662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204233.81784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204233.82085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204233.82196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204233.84058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204233.84255: stderr chunk (state=3): >>><<< 22690 1727204233.84310: stdout chunk (state=3): >>><<< 22690 1727204233.84340: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204233.84421: _low_level_execute_command(): starting 22690 1727204233.84425: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/AnsiballZ_setup.py && sleep 0' 22690 1727204233.85873: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204233.85944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204233.85960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204233.86140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204233.86175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204233.86373: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204233.88713: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 22690 1727204233.88833: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 22690 1727204233.88840: stdout chunk (state=3): >>>import 'posix' # <<< 22690 1727204233.88987: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 22690 1727204233.89161: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f365f7fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f7cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f7feab0> import '_signal' # <<< 22690 1727204233.89233: stdout chunk (state=3): >>>import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 22690 1727204233.89472: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 22690 1727204233.89475: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6111c0> <<< 22690 1727204233.89538: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204233.89566: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6120c0> <<< 22690 1727204233.89580: stdout chunk (state=3): >>>import 'site' # <<< 22690 1727204233.89661: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 22690 1727204233.90162: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22690 1727204233.90170: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22690 1727204233.90264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64ff20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6640b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22690 1727204233.90411: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22690 1727204233.90447: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6878c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 22690 1727204233.90451: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f687f50> import '_collections' # <<< 22690 1727204233.90689: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f667bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f665310> <<< 22690 1727204233.90693: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64d0d0> <<< 22690 1727204233.90818: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 22690 1727204233.90912: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22690 1727204233.90918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22690 1727204233.90922: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py 
<<< 22690 1727204233.91025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22690 1727204233.91030: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6ab890> <<< 22690 1727204233.91033: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6aa4b0> <<< 22690 1727204233.91036: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 22690 1727204233.91039: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6661e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6a8c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dc8c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64c350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 22690 1727204233.91041: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6dcd70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dcc20> <<< 22690 1727204233.91561: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204233.91567: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6dd010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64ae70> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 22690 1727204233.91570: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dd6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dd3a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6de5d0> <<< 22690 1727204233.91606: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # 
/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f4800> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6f5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f6d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6f73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6f7dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f7500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6de630> <<< 22690 1727204233.91786: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22690 1727204233.91790: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f43bce0> <<< 22690 1727204233.91793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 22690 1727204233.91796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 22690 1727204233.91802: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension 
module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f464830> <<< 22690 1727204233.91805: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f464590> <<< 22690 1727204233.91809: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f4647d0> <<< 22690 1727204233.92002: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f4649e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f439e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 22690 1727204233.92008: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f466090> <<< 22690 1727204233.92042: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f464d10> <<< 22690 1727204233.92069: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6ded20> <<< 22690 1727204233.92075: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22690 1727204233.92386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f492420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22690 1727204233.92480: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4aa570> <<< 22690 1727204233.92484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22690 1727204233.92511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # <<< 22690 1727204233.92519: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4df290> <<< 22690 1727204233.92590: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 22690 1727204233.92596: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22690 1727204233.92779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22690 1727204233.92796: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f509a30> <<< 22690 1727204233.92889: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4df3b0> <<< 22690 1727204233.92893: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4ab200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f2f4350> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4a95b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f466fc0> <<< 22690 1727204233.93167: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22690 1727204233.93173: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f365f2f4620> <<< 22690 1727204233.93259: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__u6pfvcy/ansible_ansible.legacy.setup_payload.zip' <<< 22690 1727204233.93274: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.93548: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22690 1727204233.93649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f35a120> import '_typing' # <<< 22690 1727204233.94032: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f331010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f330170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib 
available <<< 22690 1727204233.95450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.96738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f333fb0> <<< 22690 1727204233.96777: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 22690 1727204233.96811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22690 1727204233.96854: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f38db20> <<< 22690 1727204233.96881: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38d8b0> <<< 22690 1727204233.96899: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38d1c0> <<< 22690 1727204233.96921: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22690 1727204233.96963: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38d610> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f35adb0> import 'atexit' # <<< 22690 1727204233.96994: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f38e840> <<< 22690 1727204233.97067: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f38ea80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22690 1727204233.97104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 22690 1727204233.97107: stdout chunk (state=3): >>>import '_locale' # <<< 22690 1727204233.97207: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38ef60> import 'pwd' # # 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22690 1727204233.97210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 22690 1727204233.97236: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f0d10> <<< 22690 1727204233.97299: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f1f2930> <<< 22690 1727204233.97303: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 22690 1727204233.97319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 22690 1727204233.97344: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f32f0> <<< 22690 1727204233.97408: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22690 1727204233.97500: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f4230> <<< 22690 1727204233.97524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22690 1727204233.97541: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f6f90> <<< 22690 1727204233.97756: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f1f70b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f5250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22690 1727204233.97760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22690 1727204233.97763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 22690 1727204233.97767: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1faed0> <<< 22690 1727204233.98047: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f99a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f9700> <<< 22690 1727204233.98051: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1fbe60> <<< 22690 1727204233.98088: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f5760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f23f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f23f200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22690 1727204233.98131: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f244e00> <<< 22690 1727204233.98181: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f244bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22690 1727204233.98345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f2472f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f245460> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22690 1727204233.98405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 22690 1727204233.98440: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f24ea20> <<< 22690 1727204233.98575: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f2473b0> <<< 22690 1727204233.98643: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f24f890> <<< 22690 1727204233.98678: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f24f6e0> <<< 22690 1727204233.98944: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f24fb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f23f500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f253440> <<< 22690 1727204233.99018: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204233.99024: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f2544a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f251bb0> <<< 22690 1727204233.99086: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f252f60> <<< 22690 1727204233.99090: stdout chunk (state=3): >>>import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f365f2517c0> # zipimport: zlib available <<< 22690 1727204233.99093: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 22690 1727204233.99140: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.99209: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.99308: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204233.99319: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 22690 1727204233.99332: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.99342: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.99348: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # <<< 22690 1727204233.99431: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.99618: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204233.99627: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.00246: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.01132: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0dc5f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dd3a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f257ad0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 22690 1727204234.01142: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 22690 1727204234.01146: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.01328: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.01481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 22690 1727204234.01485: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dd1f0> <<< 22690 1727204234.01656: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.02271: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.02513: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.02593: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.02831: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 22690 1727204234.02838: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.03040: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22690 1727204234.03052: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # <<< 22690 1727204234.03061: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.03370: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.03572: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22690 1727204234.03784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dfe00> # zipimport: zlib available <<< 22690 1727204234.03788: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.03983: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22690 1727204234.04004: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204234.04189: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0e5f70> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0e6840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dec90> <<< 22690 1727204234.04239: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.04287: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22690 1727204234.04294: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.04348: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.04385: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.04677: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0e55e0> <<< 22690 1727204234.04683: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0e68d0> <<< 22690 1727204234.04713: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 22690 1727204234.04728: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.04858: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.04883: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.05080: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22690 1727204234.05087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22690 1727204234.05199: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f17eb40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0f0830> <<< 22690 1727204234.05276: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0ee930> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0ee7b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 22690 1727204234.05283: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.05337: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # <<< 22690 1727204234.05343: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 22690 1727204234.05562: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.05617: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.05639: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.05694: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.05701: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.05737: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.05775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 22690 1727204234.05782: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.05999: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available <<< 22690 1727204234.06003: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 22690 1727204234.06006: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.06373: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.06395: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.06434: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.06578: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 22690 1727204234.06607: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f181550> <<< 22690 1727204234.06615: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22690 1727204234.06639: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 22690 1727204234.06655: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 22690 1727204234.06707: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 22690 1727204234.06729: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22690 1727204234.06732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 22690 1727204234.06760: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fbfe0> <<< 22690 1727204234.06780: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204234.06784: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204234.06811: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e6fc380> <<< 22690 1727204234.06970: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1610a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1624e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f183980> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f183290> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 22690 1727204234.07083: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e6ff3b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fec60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204234.07142: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e6fee40> <<< 22690 1727204234.07286: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fe0c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6ff440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22690 1727204234.07296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 22690 1727204234.07317: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e769f10> <<< 22690 1727204234.07360: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fff20> <<< 22690 1727204234.07385: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f181640> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 22690 1727204234.07415: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.07475: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 22690 1727204234.07813: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 22690 1727204234.07900: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.07906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 22690 1727204234.07959: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.08003: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 22690 1727204234.08027: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.08199: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.08203: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.08345: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 22690 1727204234.08946: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.09351: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 22690 1727204234.09401: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.09456: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.09494: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.09588: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.09663: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 22690 1727204234.09786: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available <<< 22690 1727204234.09790: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.09820: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22690 1727204234.09824: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10106: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e76a120> <<< 22690 1727204234.10110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22690 1727204234.10194: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 22690 1727204234.10264: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e76acc0> import 'ansible.module_utils.facts.system.local' # <<< 22690 1727204234.10273: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10371: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10424: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 22690 1727204234.10643: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 22690 
1727204234.10651: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10701: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10785: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 22690 1727204234.10789: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10851: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.10879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22690 1727204234.10919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 22690 1727204234.10990: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204234.11086: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e7962d0> <<< 22690 1727204234.11261: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e782ae0> <<< 22690 1727204234.11265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 22690 1727204234.11353: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.11400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 22690 1727204234.11505: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.11568: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.11690: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.11848: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 22690 1727204234.11862: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.11953: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 22690 1727204234.11992: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.12057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 22690 1727204234.12073: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204234.12111: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e599e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e783290> <<< 22690 1727204234.12123: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 22690 1727204234.12270: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 22690 
1727204234.12418: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.12584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22690 1727204234.12720: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.12808: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.12887: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.12919: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 22690 1727204234.12943: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.13041: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.13178: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.13277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 22690 1727204234.13292: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.13474: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.13643: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.14269: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.14894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 22690 1727204234.14940: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.15055: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 22690 1727204234.15185: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.15287: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 22690 1727204234.15290: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.15455: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.15689: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 22690 1727204234.15780: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 22690 1727204234.15873: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.15971: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.16223: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.16436: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 22690 1727204234.16478: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.16556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 22690 1727204234.16562: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.16577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 22690 
1727204234.16670: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.16731: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 22690 1727204234.16788: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.16853: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 22690 1727204234.16996: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.17092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 22690 1727204234.17344: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.17634: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 22690 1727204234.17971: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204234.18095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 22690 1727204234.18112: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 22690 1727204234.18310: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 22690 1727204234.18323: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18350: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 22690 1727204234.18353: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18379: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18527: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18658: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22690 1727204234.18662: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18708: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18885: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22690 1727204234.18893: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.18986: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.19268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 22690 1727204234.19272: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.19304: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22690 1727204234.19340: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 
1727204234.19436: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 22690 1727204234.19519: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.19601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 22690 1727204234.19698: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available <<< 22690 1727204234.19710: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.19803: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 22690 1727204234.19832: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 22690 1727204234.19979: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204234.20408: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 22690 1727204234.20490: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e5c3830> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e5c1670> <<< 22690 1727204234.20544: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e5c22a0> <<< 22690 1727204234.34263: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 22690 1727204234.34267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e608e30> <<< 22690 1727204234.34387: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e609d90> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e65c530> <<< 22690 1727204234.34476: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f365e65c080> <<< 22690 1727204234.34673: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 22690 1727204234.55118: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "14", "epoch": "1727204234", "epoch_int": "1727204234", "date": "2024-09-24", "time": "14:57:14", "iso8601_micro": "2024-09-24T18:57:14.207186Z", "iso8601": "2024-09-24T18:57:14Z", "iso8601_basic": "20240924T145714207186", "iso8601_basic_short": "20240924T145714", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.4873046875, "5m": 0.498046875, "15m": 0.271484375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3043, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 673, "free": 3043}, "nocache": {"free": 3474, "used": 242}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", 
"ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 580, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316838400, "block_size": 4096, "block_total": 64479564, "block_available": 61356650, "block_used": 3122914, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204234.55924: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] 
removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing 
json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # 
cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file <<< 22690 1727204234.55934: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor <<< 22690 1727204234.55955: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys <<< 22690 1727204234.56096: stdout chunk (state=3): 
>>># cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd <<< 22690 1727204234.56300: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 22690 1727204234.56470: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 22690 1727204234.56497: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 22690 1727204234.56502: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 22690 1727204234.56553: stdout chunk (state=3): >>># destroy 
ntpath <<< 22690 1727204234.56556: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 22690 1727204234.56559: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings <<< 22690 1727204234.56692: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 22690 1727204234.56961: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata <<< 22690 1727204234.56964: stdout chunk (state=3): >>># destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 22690 1727204234.57072: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 22690 1727204234.57076: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum <<< 22690 1727204234.57232: stdout chunk (state=3): >>># cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping 
functools <<< 22690 1727204234.57236: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22690 1727204234.57398: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections <<< 22690 1727204234.57402: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 22690 1727204234.57472: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 22690 1727204234.57480: stdout chunk (state=3): >>># destroy _typing <<< 22690 1727204234.57518: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 22690 1727204234.57601: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22690 1727204234.57656: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 22690 1727204234.57708: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 22690 1727204234.57793: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 22690 1727204234.58599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204234.58603: stdout chunk (state=3): >>><<< 22690 1727204234.58606: stderr chunk (state=3): >>><<< 22690 1727204234.59090: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f7fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f7cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f7feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6111c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6120c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64ff20> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6640b0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6878c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f687f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f667bc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f665310> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64d0d0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6ab890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6aa4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6661e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6a8c50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dc8c0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64c350> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6dcd70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dcc20> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6dd010> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f64ae70> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dd6d0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6dd3a0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6de5d0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f4800> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6f5ee0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f365f6f6d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6f73e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f62d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f6f7dd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6f7500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6de630> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f43bce0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f464830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f464590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f4647d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f4649e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f439e80> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f466090> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f464d10> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f6ded20> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f492420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4aa570> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4df290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f509a30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4df3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4ab200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f2f4350> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f4a95b0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f466fc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f365f2f4620> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__u6pfvcy/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f35a120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f331010> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f330170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f333fb0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f38db20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38d8b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38d1c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38d610> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f35adb0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f38e840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f38ea80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f38ef60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f0d10> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f1f2930> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f32f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f4230> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f6f90> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f1f70b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f5250> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1faed0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f99a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f9700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1fbe60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1f5760> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f23f080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f23f200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f244e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f244bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f2472f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f245460> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f24ea20> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f2473b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f24f890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f24f6e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f24fb90> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f23f500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f253440> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f2544a0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f251bb0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f252f60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f2517c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0dc5f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dd3a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f257ad0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dd1f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dfe00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0e5f70> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0e6840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0dec90> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365f0e55e0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0e68d0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f17eb40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0f0830> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0ee930> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f0ee7b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f181550> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fbfe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e6fc380> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1610a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f1624e0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f183980> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f183290> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e6ff3b0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fec60> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e6fee40> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fe0c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6ff440> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e769f10> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e6fff20> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365f181640> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e76a120> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e76acc0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e7962d0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e782ae0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e599e20> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e783290> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f365e5c3830> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e5c1670> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e5c22a0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e608e30> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e609d90> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e65c530> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f365e65c080> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "14", "epoch": "1727204234", "epoch_int": "1727204234", "date": "2024-09-24", "time": "14:57:14", "iso8601_micro": "2024-09-24T18:57:14.207186Z", "iso8601": "2024-09-24T18:57:14Z", "iso8601_basic": "20240924T145714207186", "iso8601_basic_short": "20240924T145714", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_fips": false, "ansible_loadavg": {"1m": 0.4873046875, "5m": 0.498046875, "15m": 0.271484375}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3043, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, 
"ansible_memory_mb": {"real": {"total": 3716, "used": 673, "free": 3043}, "nocache": {"free": 3474, "used": 242}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 580, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316838400, "block_size": 4096, "block_total": 64479564, "block_available": 61356650, "block_used": 3122914, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": 
"/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # 
cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # 
destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing 
ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy 
ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy 
__main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # 
cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
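The stderr capture above ends with OpenSSH reusing an existing control master (auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'), so each module invocation only performs a lightweight mux handshake instead of a full SSH negotiation before reporting "Shared connection to 10.31.47.73 closed." As a minimal, illustrative sketch (not the configuration used in this run), the same connection reuse could be made explicit per host in a YAML inventory via ansible_ssh_common_args; the host entry, persist time, and socket path below are assumptions rather than values taken from this log:

    all:
      hosts:
        managed-node2:
          # Illustrative only: extra OpenSSH options enabling connection reuse.
          ansible_ssh_common_args: >-
            -o ControlMaster=auto
            -o ControlPersist=60s
            -o ControlPath=~/.ansible/cp/%C

A longer ControlPersist keeps the master socket alive between tasks, which pays off in runs like this one where every task issues several short-lived commands (temporary directory creation, module execution, cleanup) over the same connection.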
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # 
cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing 
ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing 
ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy 
ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # 
destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy 
_stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
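Two warnings close out this fact-gathering module run. The "junk after the JSON data" warning records that the module's stdout carried the Python interpreter's shutdown clean-up trace after the result JSON; the JSON itself was still parsed, and the task is reported ok below. The interpreter-discovery warning notes that /usr/bin/python3.12 was auto-detected on managed-node2. As a hedged sketch (this test run does not do this, and the group-level placement is an assumption), the interpreter could be pinned at the inventory level so the path no longer depends on discovery:

    all:
      vars:
        # Pin the interpreter that discovery found, so installing another
        # Python later cannot silently change which interpreter modules use.
        ansible_python_interpreter: /usr/bin/python3.12

Equivalently, interpreter_python can be set under [defaults] in ansible.cfg; see the interpreter_discovery reference linked in the warning for the full set of options.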
22690 1727204234.62378: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204234.62383: _low_level_execute_command(): starting 22690 1727204234.62386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204232.8365285-22782-52804327140235/ > /dev/null 2>&1 && sleep 0' 22690 1727204234.63475: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204234.63911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204234.63919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204234.63922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204234.63924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204234.65873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204234.65880: stderr chunk (state=3): >>><<< 22690 1727204234.66073: stdout chunk (state=3): >>><<< 22690 1727204234.66077: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204234.66080: handler run complete 22690 1727204234.66242: variable 'ansible_facts' from source: unknown 22690 1727204234.66474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.67309: variable 'ansible_facts' from source: unknown 22690 1727204234.67417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.67640: attempt loop complete, returning result 22690 1727204234.67651: _execute() done 22690 1727204234.67849: dumping result to json 22690 1727204234.67853: done dumping result, returning 22690 1727204234.67855: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-00000000007c] 22690 1727204234.67857: sending task result for task 127b8e07-fff9-78bb-bf56-00000000007c 22690 1727204234.69244: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000007c 22690 1727204234.69250: WORKER PROCESS EXITING ok: [managed-node2] 22690 1727204234.69390: no more pending results, returning what we have 22690 1727204234.69394: results queue empty 22690 1727204234.69395: checking for any_errors_fatal 22690 1727204234.69397: done checking for any_errors_fatal 22690 1727204234.69397: checking for max_fail_percentage 22690 1727204234.69399: done checking for max_fail_percentage 22690 1727204234.69400: checking to see if all hosts have failed and the running result is not ok 22690 1727204234.69401: done checking to see if all hosts have failed 22690 1727204234.69402: getting the remaining hosts for this loop 22690 1727204234.69403: done getting the remaining hosts for this loop 22690 1727204234.69407: getting the next task for host managed-node2 22690 1727204234.69413: done getting next task for host managed-node2 22690 1727204234.69415: ^ task is: TASK: meta (flush_handlers) 22690 1727204234.69417: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204234.69421: getting variables 22690 1727204234.69423: in VariableManager get_vars() 22690 1727204234.69445: Calling all_inventory to load vars for managed-node2 22690 1727204234.69448: Calling groups_inventory to load vars for managed-node2 22690 1727204234.69452: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204234.69579: Calling all_plugins_play to load vars for managed-node2 22690 1727204234.69583: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204234.69587: Calling groups_plugins_play to load vars for managed-node2 22690 1727204234.70052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.70411: done with get_vars() 22690 1727204234.70425: done getting variables 22690 1727204234.70616: in VariableManager get_vars() 22690 1727204234.70630: Calling all_inventory to load vars for managed-node2 22690 1727204234.70633: Calling groups_inventory to load vars for managed-node2 22690 1727204234.70636: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204234.70642: Calling all_plugins_play to load vars for managed-node2 22690 1727204234.70644: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204234.70648: Calling groups_plugins_play to load vars for managed-node2 22690 1727204234.70983: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.71401: done with get_vars() 22690 1727204234.71420: done queuing things up, now waiting for results queue to drain 22690 1727204234.71423: results queue empty 22690 1727204234.71424: checking for any_errors_fatal 22690 1727204234.71427: done checking for any_errors_fatal 22690 1727204234.71428: checking for max_fail_percentage 22690 1727204234.71429: done checking for max_fail_percentage 22690 1727204234.71430: checking to see if all hosts have failed and the running result is not ok 22690 1727204234.71431: done checking to see if all hosts have failed 22690 1727204234.71437: getting the remaining hosts for this loop 22690 1727204234.71438: done getting the remaining hosts for this loop 22690 1727204234.71441: getting the next task for host managed-node2 22690 1727204234.71447: done getting next task for host managed-node2 22690 1727204234.71450: ^ task is: TASK: Include the task 'el_repo_setup.yml' 22690 1727204234.71451: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204234.71454: getting variables 22690 1727204234.71455: in VariableManager get_vars() 22690 1727204234.71590: Calling all_inventory to load vars for managed-node2 22690 1727204234.71593: Calling groups_inventory to load vars for managed-node2 22690 1727204234.71596: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204234.71604: Calling all_plugins_play to load vars for managed-node2 22690 1727204234.71606: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204234.71609: Calling groups_plugins_play to load vars for managed-node2 22690 1727204234.71918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.72226: done with get_vars() 22690 1727204234.72351: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Tuesday 24 September 2024 14:57:14 -0400 (0:00:01.969) 0:00:02.009 ***** 22690 1727204234.72582: entering _queue_task() for managed-node2/include_tasks 22690 1727204234.72584: Creating lock for include_tasks 22690 1727204234.73484: worker is 1 (out of 1 available) 22690 1727204234.73498: exiting _queue_task() for managed-node2/include_tasks 22690 1727204234.73510: done queuing things up, now waiting for results queue to drain 22690 1727204234.73511: waiting for pending results... 22690 1727204234.74003: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 22690 1727204234.74146: in run() - task 127b8e07-fff9-78bb-bf56-000000000006 22690 1727204234.74222: variable 'ansible_search_path' from source: unknown 22690 1727204234.74404: calling self._execute() 22690 1727204234.74471: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204234.74540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204234.74610: variable 'omit' from source: magic vars 22690 1727204234.74855: _execute() done 22690 1727204234.74860: dumping result to json 22690 1727204234.74871: done dumping result, returning 22690 1727204234.74885: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [127b8e07-fff9-78bb-bf56-000000000006] 22690 1727204234.74975: sending task result for task 127b8e07-fff9-78bb-bf56-000000000006 22690 1727204234.75272: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000006 22690 1727204234.75275: WORKER PROCESS EXITING 22690 1727204234.75328: no more pending results, returning what we have 22690 1727204234.75333: in VariableManager get_vars() 22690 1727204234.75374: Calling all_inventory to load vars for managed-node2 22690 1727204234.75378: Calling groups_inventory to load vars for managed-node2 22690 1727204234.75382: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204234.75403: Calling all_plugins_play to load vars for managed-node2 22690 1727204234.75408: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204234.75412: Calling groups_plugins_play to load vars for managed-node2 22690 1727204234.76154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.76350: done with get_vars() 22690 1727204234.76360: variable 'ansible_search_path' from source: unknown 22690 1727204234.76682: we have included files to process 22690 1727204234.76684: 
generating all_blocks data 22690 1727204234.76686: done generating all_blocks data 22690 1727204234.76687: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22690 1727204234.76688: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22690 1727204234.76691: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22690 1727204234.78285: in VariableManager get_vars() 22690 1727204234.78305: done with get_vars() 22690 1727204234.78319: done processing included file 22690 1727204234.78321: iterating over new_blocks loaded from include file 22690 1727204234.78323: in VariableManager get_vars() 22690 1727204234.78333: done with get_vars() 22690 1727204234.78335: filtering new block on tags 22690 1727204234.78350: done filtering new block on tags 22690 1727204234.78353: in VariableManager get_vars() 22690 1727204234.78364: done with get_vars() 22690 1727204234.78369: filtering new block on tags 22690 1727204234.78388: done filtering new block on tags 22690 1727204234.78390: in VariableManager get_vars() 22690 1727204234.78425: done with get_vars() 22690 1727204234.78427: filtering new block on tags 22690 1727204234.78442: done filtering new block on tags 22690 1727204234.78444: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 22690 1727204234.78451: extending task lists for all hosts with included blocks 22690 1727204234.78506: done extending task lists 22690 1727204234.78508: done processing included files 22690 1727204234.78509: results queue empty 22690 1727204234.78510: checking for any_errors_fatal 22690 1727204234.78512: done checking for any_errors_fatal 22690 1727204234.78512: checking for max_fail_percentage 22690 1727204234.78514: done checking for max_fail_percentage 22690 1727204234.78515: checking to see if all hosts have failed and the running result is not ok 22690 1727204234.78516: done checking to see if all hosts have failed 22690 1727204234.78516: getting the remaining hosts for this loop 22690 1727204234.78517: done getting the remaining hosts for this loop 22690 1727204234.78520: getting the next task for host managed-node2 22690 1727204234.78525: done getting next task for host managed-node2 22690 1727204234.78527: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 22690 1727204234.78530: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204234.78532: getting variables 22690 1727204234.78534: in VariableManager get_vars() 22690 1727204234.78543: Calling all_inventory to load vars for managed-node2 22690 1727204234.78545: Calling groups_inventory to load vars for managed-node2 22690 1727204234.78548: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204234.78554: Calling all_plugins_play to load vars for managed-node2 22690 1727204234.78557: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204234.78560: Calling groups_plugins_play to load vars for managed-node2 22690 1727204234.78713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204234.78916: done with get_vars() 22690 1727204234.78925: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:57:14 -0400 (0:00:00.064) 0:00:02.073 ***** 22690 1727204234.79002: entering _queue_task() for managed-node2/setup 22690 1727204234.79344: worker is 1 (out of 1 available) 22690 1727204234.79356: exiting _queue_task() for managed-node2/setup 22690 1727204234.79372: done queuing things up, now waiting for results queue to drain 22690 1727204234.79373: waiting for pending results... 22690 1727204234.79790: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 22690 1727204234.79796: in run() - task 127b8e07-fff9-78bb-bf56-00000000008d 22690 1727204234.79799: variable 'ansible_search_path' from source: unknown 22690 1727204234.79801: variable 'ansible_search_path' from source: unknown 22690 1727204234.79826: calling self._execute() 22690 1727204234.79910: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204234.79923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204234.79938: variable 'omit' from source: magic vars 22690 1727204234.80505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204234.82840: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204234.82943: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204234.82991: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204234.83041: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204234.83077: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204234.83178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204234.83218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204234.83256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22690 1727204234.83305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204234.83324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204234.83538: variable 'ansible_facts' from source: unknown 22690 1727204234.83623: variable 'network_test_required_facts' from source: task vars 22690 1727204234.83673: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 22690 1727204234.83686: variable 'omit' from source: magic vars 22690 1727204234.83731: variable 'omit' from source: magic vars 22690 1727204234.83773: variable 'omit' from source: magic vars 22690 1727204234.83811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204234.83846: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204234.83891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204234.83896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204234.83912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204234.83946: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204234.84000: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204234.84004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204234.84078: Set connection var ansible_connection to ssh 22690 1727204234.84094: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204234.84112: Set connection var ansible_pipelining to False 22690 1727204234.84119: Set connection var ansible_shell_type to sh 22690 1727204234.84130: Set connection var ansible_shell_executable to /bin/sh 22690 1727204234.84142: Set connection var ansible_timeout to 10 22690 1727204234.84172: variable 'ansible_shell_executable' from source: unknown 22690 1727204234.84216: variable 'ansible_connection' from source: unknown 22690 1727204234.84220: variable 'ansible_module_compression' from source: unknown 22690 1727204234.84222: variable 'ansible_shell_type' from source: unknown 22690 1727204234.84225: variable 'ansible_shell_executable' from source: unknown 22690 1727204234.84227: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204234.84229: variable 'ansible_pipelining' from source: unknown 22690 1727204234.84231: variable 'ansible_timeout' from source: unknown 22690 1727204234.84233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204234.84382: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204234.84400: variable 'omit' from source: magic vars 22690 1727204234.84433: starting attempt loop 22690 
1727204234.84436: running the handler 22690 1727204234.84438: _low_level_execute_command(): starting 22690 1727204234.84445: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204234.85203: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204234.85287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204234.85346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204234.85374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204234.85389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204234.85699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204234.87451: stdout chunk (state=3): >>>/root <<< 22690 1727204234.87775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204234.87780: stdout chunk (state=3): >>><<< 22690 1727204234.87782: stderr chunk (state=3): >>><<< 22690 1727204234.87786: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204234.87799: _low_level_execute_command(): starting 22690 1727204234.87801: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904 `" && echo ansible-tmp-1727204234.8767338-22827-93331224858904="` echo 
/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904 `" ) && sleep 0' 22690 1727204234.88954: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204234.88982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204234.89084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204234.89109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204234.89136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204234.89152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204234.89305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204234.91416: stdout chunk (state=3): >>>ansible-tmp-1727204234.8767338-22827-93331224858904=/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904 <<< 22690 1727204234.91573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204234.91712: stderr chunk (state=3): >>><<< 22690 1727204234.91724: stdout chunk (state=3): >>><<< 22690 1727204234.91751: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204234.8767338-22827-93331224858904=/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204234.91978: variable 'ansible_module_compression' from source: unknown 22690 1727204234.92091: ANSIBALLZ: using cached module: 
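
The `( umask 77 && mkdir -p ... && mkdir "ansible-tmp-..." )` command above is how the controller prepares a private, uniquely named working directory on the target before anything is copied over. Below is a rough local Python approximation of that step; the base path and the exact name layout are assumptions for the demo, not Ansible's own code.

import os, random, tempfile, time

def make_private_tmpdir():
    base = os.path.join(tempfile.gettempdir(), "demo-ansible", "tmp")   # stand-in for ~/.ansible/tmp
    old_umask = os.umask(0o077)              # like `umask 77`: new dirs come out mode 0700
    try:
        os.makedirs(base, exist_ok=True)     # like `mkdir -p .../.ansible/tmp`
        name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(),
                                         random.randint(0, 2**48))
        path = os.path.join(base, name)
        os.mkdir(path)                       # like the second `mkdir`; fails if it already exists
        return path                          # echoed back so the controller learns the exact path
    finally:
        os.umask(old_umask)

print(make_private_tmpdir())
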
/root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204234.92171: variable 'ansible_facts' from source: unknown 22690 1727204234.92594: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py 22690 1727204234.93109: Sending initial data 22690 1727204234.93112: Sent initial data (153 bytes) 22690 1727204234.94438: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204234.94613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204234.94653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204234.94737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204234.94808: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204234.94924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204234.96623: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204234.96628: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204234.96796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpe25pmagl /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py <<< 22690 1727204234.96801: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py" <<< 22690 1727204234.96834: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpe25pmagl" to remote "/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py" <<< 22690 1727204235.01843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204235.01847: stdout chunk (state=3): >>><<< 22690 1727204235.01850: stderr chunk (state=3): >>><<< 22690 1727204235.01854: done transferring module to remote 22690 1727204235.01856: _low_level_execute_command(): starting 22690 1727204235.01859: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/ /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py && sleep 0' 22690 1727204235.03698: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204235.03787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204235.04120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204235.04459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204235.07192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204235.07475: stderr chunk (state=3): >>><<< 22690 1727204235.07488: stdout chunk (state=3): >>><<< 22690 1727204235.07519: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address 
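
Only one file, AnsiballZ_setup.py, goes over sftp because the module and the module_utils it needs travel inside it as an embedded ZIP_DEFLATED payload (the `ansible_setup_payload...zip` that shows up later in the trace). The toy packaging below illustrates the idea; the file names and the embedded module are invented for the example, and this is not Ansible's real wrapper template.

import base64, io, os, tempfile, textwrap, zipfile

payload = io.BytesIO()
with zipfile.ZipFile(payload, "w", zipfile.ZIP_DEFLATED) as zf:          # same compression as the cache name
    zf.writestr("toy_module.py", "def main():\n    print('toy module ran on the target')\n")

stub = textwrap.dedent("""\
    import base64, os, sys, tempfile
    ZIPDATA = {data!r}
    fd, path = tempfile.mkstemp(suffix=".zip")
    os.close(fd)
    with open(path, "wb") as f:
        f.write(base64.b64decode(ZIPDATA))
    sys.path.insert(0, path)        # the zipimport hook makes the payload importable
    import toy_module
    toy_module.main()
""").format(data=base64.b64encode(payload.getvalue()).decode())

wrapper_path = os.path.join(tempfile.gettempdir(), "AnsiballZ_toy.py")
with open(wrapper_path, "w") as f:       # this single self-contained file is what would be sftp'd
    f.write(stub)
print("wrote", wrapper_path)
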
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204235.07624: _low_level_execute_command(): starting 22690 1727204235.07636: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/AnsiballZ_setup.py && sleep 0' 22690 1727204235.09956: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204235.09989: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204235.10010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204235.10059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204235.10273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204235.13488: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22690 1727204235.13539: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 22690 1727204235.13620: stdout chunk (state=3): >>>import '_io' # <<< 22690 1727204235.13682: stdout chunk (state=3): >>>import 'marshal' # import 'posix' # <<< 22690 1727204235.13790: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 22690 1727204235.13823: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.13879: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 22690 1727204235.13973: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 22690 1727204235.13993: 
stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7374c0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73748fb30> <<< 22690 1727204235.14018: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7374c2ab0> <<< 22690 1727204235.14077: stdout chunk (state=3): >>>import '_signal' # import '_abc' # import 'abc' # <<< 22690 1727204235.14130: stdout chunk (state=3): >>>import 'io' # <<< 22690 1727204235.14258: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # <<< 22690 1727204235.14294: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 22690 1727204235.14375: stdout chunk (state=3): >>>import 'os' # <<< 22690 1727204235.14392: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 22690 1727204235.14395: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 22690 1727204235.14571: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages'<<< 22690 1727204235.14583: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372b5190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372b6090> <<< 22690 1727204235.14602: stdout chunk (state=3): >>>import 'site' # <<< 22690 1727204235.14635: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
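
The wrapper was launched with PYTHONVERBOSE=1, so from this point on most of what streams back is the interpreter's own per-import tracing (`import ...`, `# code object from ...`), not output from the setup module itself. A minimal local reproduction of that effect, using a throwaway script that is an assumption for the demo:

import os, subprocess, sys, tempfile

with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("import json\nprint('module ran')\n")
    script = f.name

env = dict(os.environ, PYTHONVERBOSE="1")               # same switch used in the command above
proc = subprocess.run([sys.executable, script], env=env,
                      capture_output=True, text=True)
print(proc.stdout.strip())                              # the script's own output
print("\n".join(proc.stderr.splitlines()[:5]))          # first few interpreter import messages (stderr in a plain local run)
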
<<< 22690 1727204235.15392: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.15405: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22690 1727204235.15434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22690 1727204235.15466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22690 1727204235.15502: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372f3f80> <<< 22690 1727204235.15604: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737308110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22690 1727204235.15630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 22690 1727204235.15767: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 22690 1727204235.16133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73732b950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73732bfe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73730bc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737309370> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372f1130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22690 1727204235.16137: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22690 1727204235.16163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22690 1727204235.16181: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py 
# code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22690 1727204235.16213: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73734f8c0> <<< 22690 1727204235.16238: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73734e4e0> <<< 22690 1727204235.16269: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73730a210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73734cd70> <<< 22690 1727204235.16328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 22690 1727204235.16558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737c980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372f03b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73737ce30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737cce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73737d0a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372eeed0> <<< 22690 1727204235.16564: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.16711: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737d760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737d430> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737e660> <<< 22690 1727204235.16720: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22690 1727204235.16894: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737398890> import 'errno' # <<< 22690 1727204235.16906: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737399fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22690 1727204235.17013: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73739ae40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73739b4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73739a390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 22690 1727204235.17020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22690 1727204235.17168: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73739be60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73739b590> <<< 22690 1727204235.17172: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737e6c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 22690 1727204235.17370: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370cfd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' 
executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370fc890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370fc5f0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370fc8c0> <<< 22690 1727204235.17373: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370fcaa0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370cdeb0> <<< 22690 1727204235.17419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22690 1727204235.17494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 22690 1727204235.17554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370fe0f0> <<< 22690 1727204235.17690: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370fcd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737edb0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22690 1727204235.17703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22690 1727204235.17726: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7371224b0> <<< 22690 1727204235.17800: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.17996: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73713e5d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22690 1727204235.18019: stdout chunk (state=3): >>>import 'ntpath' # <<< 22690 1727204235.18126: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737177350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22690 1727204235.18211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22690 1727204235.18253: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73719daf0> <<< 22690 1727204235.18330: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737177470> <<< 22690 1727204235.18391: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73713f260> <<< 22690 1727204235.18409: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736f74440> <<< 22690 1727204235.18443: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73713d610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370ff050> <<< 22690 1727204235.18587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22690 1727204235.18651: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb73713d3a0> <<< 22690 1727204235.18827: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_gtso7g_t/ansible_setup_payload.zip' # zipimport: zlib available <<< 22690 1727204235.18987: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22690 1727204235.19022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22690 1727204235.19119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22690 1727204235.19180: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fe20c0> import '_typing' # <<< 22690 1727204235.19377: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fb8fb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fb8140> # zipimport: zlib available <<< 22690 1727204235.19417: stdout chunk (state=3): >>>import 'ansible' # <<< 22690 1727204235.19455: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available <<< 22690 1727204235.19484: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 22690 1727204235.22057: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.23466: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fbbf20> <<< 22690 1727204235.23494: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737011a60> <<< 22690 1727204235.23525: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370117f0> <<< 22690 1727204235.23592: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737011100> <<< 22690 1727204235.23702: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737011550> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fe2ae0> <<< 22690 1727204235.23727: stdout chunk (state=3): >>>import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737012810> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.23739: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737012a20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22690 1727204235.23886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 22690 1727204235.23921: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737012f60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 22690 1727204235.23935: stdout chunk 
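
The `# zipimport: found 103 names in '/tmp/ansible_setup_payload_.../ansible_setup_payload.zip'` and `# zipimport: zlib available` lines show the ansible.module_utils packages being imported directly out of that payload archive, without extracting the individual files. A stdlib-only demonstration of the same mechanism; the package and file names here are made up for the example.

import os, sys, tempfile, zipfile

fd, zip_path = tempfile.mkstemp(suffix=".zip")
os.close(fd)
with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("toy_utils/__init__.py", "")
    zf.writestr("toy_utils/text.py", "def shout(s):\n    return s.upper()\n")

sys.path.insert(0, zip_path)             # Python's built-in zipimport hook handles zip entries on sys.path
from toy_utils.text import shout         # imported straight from inside the archive
print(shout("loaded from the zip"))
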
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 22690 1727204235.23973: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e74ce0> <<< 22690 1727204235.24101: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736e76900> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 22690 1727204235.24137: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e77260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22690 1727204235.24160: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e78440> <<< 22690 1727204235.24178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22690 1727204235.24236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 22690 1727204235.24248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22690 1727204235.24363: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7af00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736e7aff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e791c0> <<< 22690 1727204235.24430: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22690 1727204235.24454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 22690 1727204235.24481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22690 1727204235.24569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7ee70> import '_tokenize' # <<< 22690 1727204235.24608: stdout chunk (state=3): >>>import 'tokenize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7d940> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7d6a0> <<< 22690 1727204235.24661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22690 1727204235.24740: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7ffb0> <<< 22690 1727204235.24853: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e796d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ec2f30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ec30b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 22690 1727204235.24907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 22690 1727204235.24999: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ecccb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ecca70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22690 1727204235.25078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22690 1727204235.25121: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ecf1d0> <<< 22690 1727204235.25175: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ecd3a0> <<< 22690 1727204235.25179: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22690 1727204235.25275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from 
'/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 22690 1727204235.25300: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed2960> <<< 22690 1727204235.25643: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ecf2f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed3a70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed3b60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed3bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ec33b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22690 1727204235.25647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22690 1727204235.25669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22690 1727204235.25694: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.25708: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed7230> <<< 22690 1727204235.26084: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed8740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed59d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed6d80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed5670> # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.compat' # # zipimport: zlib available <<< 22690 1727204235.26278: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.26320: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 22690 1727204235.26349: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22690 1727204235.26352: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.26563: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.26777: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.27931: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.28809: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 22690 1727204235.28843: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 22690 1727204235.28858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.28893: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.28912: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d5c950> <<< 22690 1727204235.29022: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d5d730> <<< 22690 1727204235.29026: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736edb200> <<< 22690 1727204235.29161: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 22690 1727204235.29164: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.29170: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.29172: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22690 1727204235.29298: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.29478: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 22690 1727204235.29504: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d5d490> # zipimport: zlib available <<< 22690 1727204235.30016: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.30808: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.30919: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.31042: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 22690 1727204235.31055: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 22690 1727204235.31117: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.31205: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 22690 1727204235.31283: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.31437: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22690 1727204235.31467: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.31476: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 22690 1727204235.31573: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.31605: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 22690 1727204235.31619: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.32141: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.32592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22690 1727204235.32698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 22690 1727204235.32803: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d5fa40> <<< 22690 1727204235.32930: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.33064: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 22690 1727204235.33078: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 22690 1727204235.33099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22690 1727204235.33213: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.33402: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d66150> <<< 22690 1727204235.33469: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.33513: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d66a20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed3e30> # zipimport: zlib available <<< 22690 1727204235.33577: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.33649: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22690 1727204235.33652: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.33720: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.33874: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 
1727204235.33878: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.33989: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22690 1727204235.34046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.34194: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d65820> <<< 22690 1727204235.34240: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d66ba0> <<< 22690 1727204235.34279: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 22690 1727204235.34296: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.34401: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.34504: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.34558: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.34605: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.34664: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 22690 1727204235.34667: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22690 1727204235.34688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22690 1727204235.34800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22690 1727204235.34958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dfacc0> <<< 22690 1727204235.34981: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d70a40> <<< 22690 1727204235.35102: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d6eae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d6e930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 22690 1727204235.35150: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.35190: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22690 1727204235.35285: stdout chunk (state=3): >>>import 
'ansible.module_utils.basic' # <<< 22690 1727204235.35292: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.35332: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 22690 1727204235.35430: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.35555: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.35590: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.35653: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.35769: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.35826: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 22690 1727204235.35847: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.35971: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.36108: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.36136: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.36193: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 22690 1727204235.36512: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.36888: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.36964: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204235.36994: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 22690 1727204235.37019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 22690 1727204235.37036: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 22690 1727204235.37078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 22690 1727204235.37105: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dfdb20> <<< 22690 1727204235.37142: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22690 1727204235.37175: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 22690 1727204235.37240: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 22690 1727204235.37262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22690 1727204235.37301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb73633c320> <<< 22690 1727204235.37332: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.37433: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73633c6e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ddd370> <<< 22690 1727204235.37452: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ddc2c0> <<< 22690 1727204235.37503: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dfc200> <<< 22690 1727204235.37518: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dffcb0> <<< 22690 1727204235.37538: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 22690 1727204235.37604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 22690 1727204235.37637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 22690 1727204235.37676: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 22690 1727204235.37683: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 22690 1727204235.37727: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73633f5f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73633eed0> <<< 22690 1727204235.37788: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73633f080> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73633e330> <<< 22690 1727204235.37801: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 22690 1727204235.37965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 22690 1727204235.38002: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73633f740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22690 1727204235.38047: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 22690 1727204235.38128: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.38133: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7363a6240> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363a4260> <<< 22690 1727204235.38166: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dffec0> <<< 22690 1727204235.38175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 22690 1727204235.38191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 22690 1727204235.38212: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38228: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38233: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other' # <<< 22690 1727204235.38351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.38456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 22690 1727204235.38473: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38568: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38633: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 22690 1727204235.38653: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38657: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38692: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # <<< 22690 1727204235.38695: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38731: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 22690 1727204235.38789: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38869: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.38946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 22690 1727204235.38949: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.39076: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.39094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 22690 1727204235.39119: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.39186: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.39289: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.39378: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.39478: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 22690 1727204235.39494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # <<< 22690 1727204235.39870: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.40419: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 22690 1727204235.41236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 22690 1727204235.41243: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41332: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41419: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41461: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41518: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 22690 1727204235.41526: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # <<< 22690 1727204235.41541: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41577: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41623: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 22690 1727204235.41629: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41723: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41804: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 22690 1727204235.41826: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41868: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41910: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22690 1727204235.41926: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.41973: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.42009: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 22690 1727204235.42025: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.42154: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.42293: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 22690 1727204235.42299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 22690 1727204235.42334: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363a7ef0> <<< 22690 1727204235.42359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22690 1727204235.42415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 22690 1727204235.42631: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363a6fc0> <<< 22690 1727204235.42638: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 22690 1727204235.42757: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.42853: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 22690 1727204235.42877: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.43024: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.43184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 22690 1727204235.43187: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.43291: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 22690 1727204235.43431: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 22690 1727204235.43491: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.43553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22690 1727204235.43622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 22690 1727204235.43726: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204235.43836: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7363da570> <<< 22690 1727204235.44177: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363c2d20> <<< 22690 1727204235.44183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 22690 1727204235.44281: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.44369: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 22690 1727204235.44379: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.44525: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.44663: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.44873: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45121: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 22690 1727204235.45125: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 22690 1727204235.45138: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45206: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45252: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 22690 1727204235.45276: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45336: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45415: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 22690 1727204235.45485: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7363f6060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363f7a40> <<< 22690 1727204235.45497: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 22690 1727204235.45516: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45553: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 22690 1727204235.45601: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 
1727204235.45662: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 22690 1727204235.45767: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.45960: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22690 1727204235.46224: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46418: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46575: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46644: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 22690 1727204235.46704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 22690 1727204235.46726: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46748: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.46786: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.47029: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.47295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 22690 1727204235.47512: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.47739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 22690 1727204235.47840: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.48853: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.49805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 22690 1727204235.49810: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.49997: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.50181: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 22690 1727204235.50189: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.50529: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # <<< 22690 1727204235.50534: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.50805: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51085: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 22690 1727204235.51094: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51113: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 22690 1727204235.51135: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51206: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 22690 1727204235.51278: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51445: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51618: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.51997: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 22690 1727204235.52376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 22690 1727204235.52384: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 22690 1727204235.52392: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52447: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52496: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 22690 1727204235.52512: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52541: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 22690 1727204235.52591: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52710: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52827: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 22690 1727204235.52836: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.52901: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 22690 1727204235.52913: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.53004: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.53095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 22690 1727204235.53153: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.53204: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.53294: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 22690 1727204235.53309: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.53794: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54286: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 22690 1727204235.54295: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54475: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # <<< 22690 1727204235.54481: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54536: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54579: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 22690 1727204235.54595: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54638: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 22690 1727204235.54699: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54751: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 22690 1727204235.54814: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.54943: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55074: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 22690 1727204235.55104: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 
1727204235.55109: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 22690 1727204235.55194: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.55263: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 22690 1727204235.55268: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55300: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55326: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55408: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55483: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55608: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.55737: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 22690 1727204235.55740: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22690 1727204235.55831: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204235.55907: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22690 1727204235.55926: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.56282: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.56631: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 22690 1727204235.56640: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.56714: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.56787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22690 1727204235.56806: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.56873: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.56947: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 22690 1727204235.56961: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.57156: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.57236: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 22690 1727204235.57242: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 22690 1727204235.57354: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.57398: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.57553: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 22690 1727204235.57562: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 22690 1727204235.57678: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204235.59146: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 22690 1727204235.59182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 22690 1727204235.59243: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc'<<< 22690 1727204235.59251: stdout chunk (state=3): >>> <<< 22690 1727204235.59287: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so'<<< 22690 1727204235.59325: stdout chunk (state=3): >>> import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73621ffb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73621f5f0> <<< 22690 1727204235.59439: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73621e330><<< 22690 1727204235.59445: stdout chunk (state=3): >>> <<< 22690 1727204235.59934: stdout chunk (state=3): >>> <<< 22690 1727204235.59959: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "15", "epoch": "1727204235", "epoch_int": "1727204235", "date": "2024-09-24", "time": "14:57:15", "iso8601_micro": "2024-09-24T18:57:15.584672Z", "iso8601": "2024-09-24T18:57:15Z", "iso8601_basic": "20240924T145715584672", "iso8601_basic_short": "20240924T145715", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "versio<<< 22690 1727204235.60083: stdout chunk (state=3): >>>n_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": 
"Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204235.60795: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 22690 1727204235.60803: stdout chunk (state=3): >>># clear sys.path_hooks <<< 22690 1727204235.60835: stdout chunk (state=3): >>># clear builtins._ # clear sys.path <<< 22690 1727204235.60842: stdout chunk (state=3): >>># clear sys.argv # clear sys.ps1 <<< 22690 1727204235.60866: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 22690 1727204235.60883: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout<<< 22690 1727204235.60886: stdout chunk (state=3): >>> # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread<<< 22690 1727204235.60912: stdout chunk (state=3): >>> # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] 
removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs<<< 22690 1727204235.60929: stdout chunk (state=3): >>> # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc<<< 22690 1727204235.60948: stdout chunk (state=3): >>> # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath<<< 22690 1727204235.60962: stdout chunk (state=3): >>> # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site<<< 22690 1727204235.60983: stdout chunk (state=3): >>> # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib<<< 22690 1727204235.61005: stdout chunk (state=3): >>> # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser<<< 22690 1727204235.61025: stdout chunk (state=3): >>> # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii <<< 22690 1727204235.61035: stdout chunk (state=3): >>># cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util<<< 22690 1727204235.61059: stdout chunk (state=3): >>> # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression<<< 22690 1727204235.61073: stdout chunk (state=3): >>> # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect<<< 22690 1727204235.61089: stdout chunk (state=3): >>> # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random<<< 22690 1727204235.61100: stdout chunk (state=3): >>> # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath<<< 22690 1727204235.61128: stdout chunk (state=3): >>> # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path<<< 22690 1727204235.61138: stdout chunk (state=3): >>> # cleanup[2] removing zipfile # cleanup[2] removing 
encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil<<< 22690 1727204235.61175: stdout chunk (state=3): >>> # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils<<< 22690 1727204235.61202: stdout chunk (state=3): >>> # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess<<< 22690 1727204235.61232: stdout chunk (state=3): >>> # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd<<< 22690 1727204235.61240: stdout chunk (state=3): >>> # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string<<< 22690 1727204235.61268: stdout chunk (state=3): >>> # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array<<< 22690 1727204235.61282: stdout chunk (state=3): >>> # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common<<< 22690 1727204235.61304: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc <<< 22690 1727204235.61312: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text<<< 22690 1727204235.61340: stdout chunk (state=3): >>> # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors<<< 22690 1727204235.61349: stdout chunk (state=3): >>> # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast<<< 22690 1727204235.61374: stdout chunk (state=3): >>> # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation <<< 22690 1727204235.61385: stdout chunk (state=3): >>># destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib<<< 22690 1727204235.61404: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux<<< 22690 1727204235.61420: stdout chunk (state=3): >>> # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext<<< 22690 1727204235.61437: stdout chunk (state=3): >>> # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info<<< 22690 1727204235.61454: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing<<< 22690 1727204235.61479: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing<<< 22690 1727204235.61491: stdout chunk (state=3): >>> # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection<<< 22690 1727204235.61512: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps<<< 22690 1727204235.61526: stdout chunk (state=3): >>> # cleanup[2] removing 
ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time<<< 22690 1727204235.61675: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy 
ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 22690 1727204235.62128: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 22690 1727204235.62139: stdout chunk (state=3): >>> <<< 22690 1727204235.62177: stdout chunk (state=3): >>># destroy importlib.machinery <<< 22690 1727204235.62181: stdout chunk (state=3): >>># destroy importlib._abc <<< 22690 1727204235.62185: stdout chunk (state=3): >>># destroy importlib.util <<< 22690 1727204235.62215: stdout chunk (state=3): >>># destroy _bz2<<< 22690 1727204235.62236: stdout chunk (state=3): >>> <<< 22690 1727204235.62239: stdout chunk (state=3): >>># destroy _compression <<< 22690 1727204235.62247: stdout chunk (state=3): >>># destroy _lzma<<< 22690 1727204235.62253: stdout chunk (state=3): >>> <<< 22690 1727204235.62289: stdout chunk (state=3): >>># destroy binascii<<< 22690 1727204235.62292: stdout chunk (state=3): >>> <<< 22690 1727204235.62300: stdout chunk (state=3): >>># destroy zlib<<< 22690 1727204235.62306: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma # destroy zipfile._path<<< 22690 1727204235.62339: stdout chunk (state=3): >>> # destroy zipfile <<< 22690 1727204235.62349: stdout chunk (state=3): >>># destroy pathlib<<< 22690 1727204235.62354: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy ipaddress<<< 22690 1727204235.62405: stdout chunk (state=3): >>> # destroy ntpath <<< 22690 1727204235.62435: stdout chunk (state=3): >>># destroy importlib <<< 22690 1727204235.62442: stdout chunk (state=3): >>># destroy zipimport <<< 22690 1727204235.62460: stdout chunk (state=3): >>># destroy __main__ <<< 22690 1727204235.62486: stdout chunk (state=3): >>># destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder <<< 22690 1727204235.62509: stdout chunk (state=3): >>># destroy json.encoder # destroy json.scanner # destroy _json # destroy grp<<< 22690 1727204235.62540: stdout chunk (state=3): >>> # destroy encodings # destroy _locale <<< 22690 1727204235.62570: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal<<< 22690 1727204235.62575: stdout chunk (state=3): >>> # destroy _posixsubprocess<<< 22690 1727204235.62598: stdout chunk (state=3): >>> # destroy syslog # destroy uuid<<< 22690 1727204235.62667: stdout chunk (state=3): >>> # destroy _hashlib<<< 22690 1727204235.62682: stdout chunk (state=3): >>> # destroy _blake2<<< 22690 1727204235.62703: stdout chunk (state=3): >>> # destroy selinux <<< 22690 1727204235.62714: stdout chunk (state=3): >>># destroy shutil<<< 22690 1727204235.62718: stdout chunk (state=3): >>> <<< 22690 1727204235.62732: stdout chunk (state=3): >>># destroy distro<<< 22690 1727204235.62795: stdout chunk (state=3): >>> # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 22690 1727204235.62817: stdout chunk (state=3): >>># destroy 
ansible.module_utils.facts.ansible_collector <<< 22690 1727204235.62830: stdout chunk (state=3): >>># destroy multiprocessing <<< 22690 1727204235.62850: stdout chunk (state=3): >>># destroy multiprocessing.connection <<< 22690 1727204235.62861: stdout chunk (state=3): >>># destroy multiprocessing.pool # destroy signal<<< 22690 1727204235.62870: stdout chunk (state=3): >>> # destroy pickle<<< 22690 1727204235.62901: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle <<< 22690 1727204235.62925: stdout chunk (state=3): >>># destroy queue<<< 22690 1727204235.62939: stdout chunk (state=3): >>> # destroy _heapq<<< 22690 1727204235.62958: stdout chunk (state=3): >>> # destroy _queue<<< 22690 1727204235.62974: stdout chunk (state=3): >>> # destroy multiprocessing.process # destroy unicodedata<<< 22690 1727204235.62981: stdout chunk (state=3): >>> # destroy tempfile # destroy multiprocessing.util<<< 22690 1727204235.62999: stdout chunk (state=3): >>> # destroy multiprocessing.reduction # destroy selectors <<< 22690 1727204235.63023: stdout chunk (state=3): >>># destroy _multiprocessing # destroy shlex<<< 22690 1727204235.63033: stdout chunk (state=3): >>> <<< 22690 1727204235.63047: stdout chunk (state=3): >>># destroy fcntl # destroy datetime<<< 22690 1727204235.63066: stdout chunk (state=3): >>> <<< 22690 1727204235.63071: stdout chunk (state=3): >>># destroy subprocess # destroy base64<<< 22690 1727204235.63102: stdout chunk (state=3): >>> # destroy _ssl <<< 22690 1727204235.63137: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 22690 1727204235.63149: stdout chunk (state=3): >>> # destroy getpass<<< 22690 1727204235.63158: stdout chunk (state=3): >>> # destroy pwd <<< 22690 1727204235.63173: stdout chunk (state=3): >>># destroy termios # destroy errno<<< 22690 1727204235.63212: stdout chunk (state=3): >>> # destroy json # destroy socket <<< 22690 1727204235.63221: stdout chunk (state=3): >>># destroy struct <<< 22690 1727204235.63241: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 22690 1727204235.63251: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector<<< 22690 1727204235.63323: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.idna <<< 22690 1727204235.63328: stdout chunk (state=3): >>># destroy stringprep <<< 22690 1727204235.63349: stdout chunk (state=3): >>># cleanup[3] wiping configparser <<< 22690 1727204235.63355: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 22690 1727204235.63376: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc<<< 22690 1727204235.63406: stdout chunk (state=3): >>> # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 22690 1727204235.63418: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal<<< 22690 1727204235.63422: stdout chunk (state=3): >>> # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 22690 1727204235.63442: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize<<< 22690 1727204235.63448: stdout chunk (state=3): >>> # 
cleanup[3] wiping _tokenize<<< 22690 1727204235.63471: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing<<< 22690 1727204235.63491: stdout chunk (state=3): >>> # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib<<< 22690 1727204235.63511: stdout chunk (state=3): >>> # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 22690 1727204235.63523: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random<<< 22690 1727204235.63539: stdout chunk (state=3): >>> # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 22690 1727204235.63546: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 22690 1727204235.63567: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum<<< 22690 1727204235.63574: stdout chunk (state=3): >>> # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 22690 1727204235.63594: stdout chunk (state=3): >>> # cleanup[3] wiping _sre<<< 22690 1727204235.63606: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools<<< 22690 1727204235.63629: stdout chunk (state=3): >>> # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections<<< 22690 1727204235.63639: stdout chunk (state=3): >>> # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 22690 1727204235.63656: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath<<< 22690 1727204235.63677: stdout chunk (state=3): >>> # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc<<< 22690 1727204235.63701: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs<<< 22690 1727204235.63715: stdout chunk (state=3): >>> # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix<<< 22690 1727204235.63737: stdout chunk (state=3): >>> # cleanup[3] wiping marshal<<< 22690 1727204235.63743: stdout chunk (state=3): >>> # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread<<< 22690 1727204235.63757: stdout chunk (state=3): >>> # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins<<< 22690 1727204235.63777: stdout chunk (state=3): >>> # destroy selinux._selinux <<< 22690 1727204235.63917: stdout chunk (state=3): >>># destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22690 1727204235.63989: stdout chunk (state=3): >>># destroy sys.monitoring <<< 22690 1727204235.63992: stdout chunk (state=3): >>># destroy _socket <<< 22690 1727204235.64015: stdout chunk (state=3): >>># destroy _collections <<< 22690 1727204235.64059: stdout chunk (state=3): >>># destroy platform <<< 22690 1727204235.64063: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 22690 1727204235.64082: stdout chunk (state=3): >>># destroy tokenize 
<<< 22690 1727204235.64100: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 22690 1727204235.64222: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22690 1727204235.64355: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 22690 1727204235.64359: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 22690 1727204235.64378: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 22690 1727204235.64424: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 22690 1727204235.64460: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 22690 1727204235.64464: stdout chunk (state=3): >>># clear sys.audit hooks <<< 22690 1727204235.65067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204235.65071: stdout chunk (state=3): >>><<< 22690 1727204235.65074: stderr chunk (state=3): >>><<< 22690 1727204235.65108: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7374c0530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73748fb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7374c2ab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372b5190> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372b6090> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372f3f80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737308110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73732b950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73732bfe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73730bc20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737309370> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372f1130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73734f8c0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73734e4e0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73730a210> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73734cd70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737c980> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372f03b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73737ce30> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737cce0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73737d0a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7372eeed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737d760> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737d430> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737e660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737398890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737399fa0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fb73739ae40> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73739b4a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73739a390> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73739be60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73739b590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737e6c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370cfd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370fc890> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370fc5f0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370fc8c0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7370fcaa0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370cdeb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370fe0f0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370fcd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73737edb0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7371224b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73713e5d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737177350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73719daf0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737177470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73713f260> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736f74440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73713d610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370ff050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fb73713d3a0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_gtso7g_t/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fe20c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fb8fb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fb8140> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fbbf20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737011a60> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7370117f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737011100> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737011550> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736fe2ae0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737012810> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb737012a20> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb737012f60> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e74ce0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736e76900> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e77260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e78440> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7af00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736e7aff0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e791c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7ee70> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7d940> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7d6a0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e7ffb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736e796d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ec2f30> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ec30b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ecccb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ecca70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ecf1d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ecd3a0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed2960> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ecf2f0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed3a70> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed3b60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed3bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ec33b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed7230> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed8740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed59d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736ed6d80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed5670> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d5c950> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d5d730> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736edb200> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d5d490> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d5fa40> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d66150> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d66a20> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ed3e30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb736d65820> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d66ba0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dfacc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d70a40> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d6eae0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736d6e930> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dfdb20> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fb73633c320> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73633c6e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ddd370> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736ddc2c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dfc200> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dffcb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73633f5f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73633eed0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73633f080> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73633e330> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73633f740> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7363a6240> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363a4260> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb736dffec0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363a7ef0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363a6fc0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7363da570> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363c2d20> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb7363f6060> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb7363f7a40> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fb73621ffb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73621f5f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fb73621e330> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "15", "epoch": "1727204235", "epoch_int": "1727204235", "date": "2024-09-24", "time": "14:57:15", "iso8601_micro": "2024-09-24T18:57:15.584672Z", "iso8601": "2024-09-24T18:57:15Z", "iso8601_basic": "20240924T145715584672", "iso8601_basic_short": "20240924T145715", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": 
"UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # 
cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # 
destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing 
ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep 
# cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping 
operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 
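[editor's note] The module's stdout above is the ansible_facts JSON wrapped in PYTHONVERBOSE import/cleanup noise, which is why the warning just below reports "junk after the JSON data". The sketch that follows is only an illustration of how such output could be separated into the JSON payload and the trailing noise; `extract_module_json` is a hypothetical helper, not Ansible's actual parser, and the sample string is invented for the example.

```python
# Minimal sketch (NOT Ansible's implementation): pull the first JSON object out
# of module stdout that is surrounded by PYTHONVERBOSE import/cleanup chatter.
import json


def extract_module_json(stdout: str) -> tuple[dict, str]:
    """Return (parsed_payload, trailing_junk) from noisy module output.

    Simplification: assumes the first '{' in the stream starts the JSON
    document, which holds for the output shown in this log.
    """
    decoder = json.JSONDecoder()
    start = stdout.index("{")                 # first candidate JSON object
    obj, end = decoder.raw_decode(stdout, start)
    return obj, stdout[end:]                  # leftover text is the "junk"


# Invented example mirroring the shape of the log above: verbose import noise,
# then the facts JSON, then interpreter cleanup messages.
noisy = (
    "import encodings.idna # <loader> "
    '{"ansible_facts": {"ansible_fips": false}} '
    "# clear sys.path_importer_cache # clear sys.path_hooks"
)
facts, junk = extract_module_json(noisy)
print(facts["ansible_facts"]["ansible_fips"])  # -> False
print(junk.strip())                            # -> the cleanup noise that
                                               #    triggers the warning below
```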
Shared connection to 10.31.47.73 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # 
cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy 
ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # 
destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy 
json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] 
wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 22690 1727204235.66181: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204235.66185: _low_level_execute_command(): starting 22690 1727204235.66187: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204234.8767338-22827-93331224858904/ > /dev/null 2>&1 && sleep 0' 22690 1727204235.66190: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204235.66192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204235.66194: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204235.66197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204235.66199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204235.66202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204235.66238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204235.66242: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204235.66324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204235.68642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204235.68647: stdout chunk (state=3): >>><<< 22690 1727204235.68650: stderr chunk (state=3): >>><<< 22690 1727204235.68682: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204235.68688: handler run complete 22690 1727204235.68748: variable 'ansible_facts' from source: unknown 22690 1727204235.68874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204235.68997: variable 'ansible_facts' from source: unknown 22690 1727204235.69032: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204235.69072: attempt loop complete, returning result 22690 1727204235.69075: _execute() done 22690 1727204235.69084: dumping result to json 22690 1727204235.69096: done dumping result, returning 22690 1727204235.69107: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [127b8e07-fff9-78bb-bf56-00000000008d] 22690 1727204235.69110: sending task result for task 127b8e07-fff9-78bb-bf56-00000000008d 22690 1727204235.69263: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000008d 22690 1727204235.69269: WORKER PROCESS EXITING ok: [managed-node2] 22690 1727204235.69425: no more pending results, returning what we have 22690 
1727204235.69429: results queue empty 22690 1727204235.69429: checking for any_errors_fatal 22690 1727204235.69431: done checking for any_errors_fatal 22690 1727204235.69432: checking for max_fail_percentage 22690 1727204235.69433: done checking for max_fail_percentage 22690 1727204235.69434: checking to see if all hosts have failed and the running result is not ok 22690 1727204235.69435: done checking to see if all hosts have failed 22690 1727204235.69435: getting the remaining hosts for this loop 22690 1727204235.69437: done getting the remaining hosts for this loop 22690 1727204235.69440: getting the next task for host managed-node2 22690 1727204235.69450: done getting next task for host managed-node2 22690 1727204235.69452: ^ task is: TASK: Check if system is ostree 22690 1727204235.69455: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204235.69458: getting variables 22690 1727204235.69459: in VariableManager get_vars() 22690 1727204235.69489: Calling all_inventory to load vars for managed-node2 22690 1727204235.69492: Calling groups_inventory to load vars for managed-node2 22690 1727204235.69495: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204235.69506: Calling all_plugins_play to load vars for managed-node2 22690 1727204235.69508: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204235.69511: Calling groups_plugins_play to load vars for managed-node2 22690 1727204235.69631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204235.69751: done with get_vars() 22690 1727204235.69759: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:57:15 -0400 (0:00:00.908) 0:00:02.981 ***** 22690 1727204235.69837: entering _queue_task() for managed-node2/stat 22690 1727204235.70084: worker is 1 (out of 1 available) 22690 1727204235.70097: exiting _queue_task() for managed-node2/stat 22690 1727204235.70110: done queuing things up, now waiting for results queue to drain 22690 1727204235.70112: waiting for pending results... 
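[editor's note] The next task queued here, "Check if system is ostree" (el_repo_setup.yml:17), is a stat guarded by the conditional evaluated in the trace below ("not __network_is_ostree is defined"). The snippet that follows is only a rough stand-in for what such a check amounts to on the managed host; the marker path is an assumption (/run/ostree-booted is a common ostree indicator) and the real task is not reproduced verbatim.

```python
# Minimal sketch, assuming /run/ostree-booted as the marker file; the actual
# role task uses the stat module and only runs when __network_is_ostree is
# not already defined.
import os


def system_is_ostree(marker: str = "/run/ostree-booted") -> bool:
    """Return True if the ostree marker file exists on this host."""
    return os.path.exists(marker)


if __name__ == "__main__":
    print("system is ostree:", system_is_ostree())
```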
22690 1727204235.70272: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 22690 1727204235.70333: in run() - task 127b8e07-fff9-78bb-bf56-00000000008f 22690 1727204235.70349: variable 'ansible_search_path' from source: unknown 22690 1727204235.70352: variable 'ansible_search_path' from source: unknown 22690 1727204235.70389: calling self._execute() 22690 1727204235.70454: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204235.70460: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204235.70475: variable 'omit' from source: magic vars 22690 1727204235.70977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204235.71473: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204235.71477: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204235.71481: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204235.71506: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204235.71643: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204235.71679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204235.71731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204235.71826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204235.71946: Evaluated conditional (not __network_is_ostree is defined): True 22690 1727204235.71958: variable 'omit' from source: magic vars 22690 1727204235.72013: variable 'omit' from source: magic vars 22690 1727204235.72077: variable 'omit' from source: magic vars 22690 1727204235.72112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204235.72163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204235.72192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204235.72261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204235.72265: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204235.72281: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204235.72289: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204235.72296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204235.72414: Set connection var ansible_connection to ssh 22690 1727204235.72434: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204235.72470: 
Set connection var ansible_pipelining to False 22690 1727204235.72474: Set connection var ansible_shell_type to sh 22690 1727204235.72485: Set connection var ansible_shell_executable to /bin/sh 22690 1727204235.72487: Set connection var ansible_timeout to 10 22690 1727204235.72515: variable 'ansible_shell_executable' from source: unknown 22690 1727204235.72571: variable 'ansible_connection' from source: unknown 22690 1727204235.72575: variable 'ansible_module_compression' from source: unknown 22690 1727204235.72579: variable 'ansible_shell_type' from source: unknown 22690 1727204235.72581: variable 'ansible_shell_executable' from source: unknown 22690 1727204235.72583: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204235.72585: variable 'ansible_pipelining' from source: unknown 22690 1727204235.72596: variable 'ansible_timeout' from source: unknown 22690 1727204235.72598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204235.72820: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204235.72824: variable 'omit' from source: magic vars 22690 1727204235.72826: starting attempt loop 22690 1727204235.72829: running the handler 22690 1727204235.72831: _low_level_execute_command(): starting 22690 1727204235.72926: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204235.73859: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204235.73885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204235.73982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204235.74042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204235.74144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204235.75807: stdout chunk (state=3): >>>/root <<< 22690 1727204235.76325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204235.76331: stdout chunk (state=3): >>><<< 22690 1727204235.76335: stderr chunk (state=3): >>><<< 22690 1727204235.76340: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204235.76352: _low_level_execute_command(): starting 22690 1727204235.76355: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855 `" && echo ansible-tmp-1727204235.762173-22866-46856302630855="` echo /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855 `" ) && sleep 0' 22690 1727204235.77557: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204235.77578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204235.77592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204235.77613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204235.77724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204235.77750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204235.77858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204235.79829: stdout chunk (state=3): >>>ansible-tmp-1727204235.762173-22866-46856302630855=/root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855 <<< 22690 1727204235.80015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204235.80026: stdout chunk (state=3): >>><<< 22690 1727204235.80040: stderr chunk (state=3): >>><<< 22690 1727204235.80067: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204235.762173-22866-46856302630855=/root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204235.80371: variable 'ansible_module_compression' from source: unknown 22690 1727204235.80374: ANSIBALLZ: Using lock for stat 22690 1727204235.80377: ANSIBALLZ: Acquiring lock 22690 1727204235.80379: ANSIBALLZ: Lock acquired: 139846653778672 22690 1727204235.80381: ANSIBALLZ: Creating module 22690 1727204236.18275: ANSIBALLZ: Writing module into payload 22690 1727204236.18450: ANSIBALLZ: Writing module 22690 1727204236.18530: ANSIBALLZ: Renaming module 22690 1727204236.18606: ANSIBALLZ: Done creating module 22690 1727204236.19047: variable 'ansible_facts' from source: unknown 22690 1727204236.19051: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py 22690 1727204236.19789: Sending initial data 22690 1727204236.19793: Sent initial data (151 bytes) 22690 1727204236.21192: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204236.21520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204236.21588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204236.23216: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204236.23276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204236.23337: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp4yf7e40b /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py <<< 22690 1727204236.23346: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py" <<< 22690 1727204236.23431: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp4yf7e40b" to remote "/root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py" <<< 22690 1727204236.25094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204236.25416: stdout chunk (state=3): >>><<< 22690 1727204236.25421: stderr chunk (state=3): >>><<< 22690 1727204236.25423: done transferring module to remote 22690 1727204236.25425: _low_level_execute_command(): starting 22690 1727204236.25428: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/ /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py && sleep 0' 22690 1727204236.26648: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204236.26706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204236.26884: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204236.26921: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204236.27011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204236.29907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204236.29911: stderr chunk (state=3): >>><<< 
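
[editor's note] Up to this point the executor has driven a fixed sequence of low-level commands over the multiplexed SSH connection: `echo ~ && sleep 0` to resolve the remote home, `umask 77 && mkdir -p ...` to create a private ansible-tmp directory, an sftp put of the generated AnsiballZ_stat.py payload, and a `chmod u+x` on the directory and module before the remote interpreter is invoked. A rough stand-alone sketch of that sequence, assuming plain ssh/scp binaries, a hypothetical host alias, and an already-built payload file (the real run reuses the ControlMaster socket at /root/.ansible/cp/7ef5e35320 rather than opening new connections):

    import random
    import subprocess
    import time

    HOST = "managed-node2"         # hypothetical SSH alias, for illustration only
    PAYLOAD = "AnsiballZ_stat.py"  # assumes the module payload already exists locally

    # Temp dir named the way the log shows: ansible-tmp-<epoch>-<pid>-<random>
    tmpdir = "/root/.ansible/tmp/ansible-tmp-{}-{}-{}".format(
        time.time(), random.randint(10000, 99999), random.randint(0, 2**48)
    )

    # 1. discover the remote home directory
    subprocess.run(["ssh", HOST, "echo ~ && sleep 0"], check=True)
    # 2. create the private temp directory (umask 77 keeps it owner-only)
    subprocess.run(["ssh", HOST, f"umask 77 && mkdir -p '{tmpdir}'"], check=True)
    # 3. transfer the module payload (the real run does an sftp put over the mux socket)
    subprocess.run(["scp", PAYLOAD, f"{HOST}:{tmpdir}/AnsiballZ_stat.py"], check=True)
    # 4. mark it executable, then run it with the remote interpreter
    subprocess.run(["ssh", HOST, f"chmod u+x '{tmpdir}' '{tmpdir}/AnsiballZ_stat.py'"], check=True)
    subprocess.run(
        ["ssh", HOST, f"PYTHONVERBOSE=1 /usr/bin/python3.12 '{tmpdir}/AnsiballZ_stat.py'"],
        check=True,
    )
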
22690 1727204236.29914: stdout chunk (state=3): >>><<< 22690 1727204236.29934: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204236.30030: _low_level_execute_command(): starting 22690 1727204236.30034: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/AnsiballZ_stat.py && sleep 0' 22690 1727204236.31275: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204236.31387: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204236.31402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204236.31413: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204236.31605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204236.31609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204236.31719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204236.35149: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22690 1727204236.35177: stdout chunk (state=3): >>>import _imp # builtin <<< 22690 1727204236.35231: stdout chunk (state=3): >>>import '_thread' # <<< 22690 1727204236.35251: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 22690 1727204236.35381: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 22690 1727204236.35439: stdout chunk 
(state=3): >>>import 'posix' # <<< 22690 1727204236.35490: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 22690 1727204236.35564: stdout chunk (state=3): >>># installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 22690 1727204236.35664: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 22690 1727204236.35681: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 22690 1727204236.35702: stdout chunk (state=3): >>>import 'codecs' # <<< 22690 1727204236.35963: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d5fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d5cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d5feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 22690 1727204236.36119: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22690 1727204236.36158: stdout chunk (state=3): >>>import 'genericpath' # <<< 22690 1727204236.36236: stdout chunk (state=3): >>>import 'posixpath' # import 'os' # <<< 22690 1727204236.36250: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 22690 1727204236.36312: stdout chunk (state=3): >>>Processing user site-packages <<< 22690 1727204236.36316: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 22690 1727204236.36604: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 22690 1727204236.36608: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3d20c0> <<< 22690 1727204236.36632: stdout chunk (state=3): >>>import 'site' # <<< 22690 1727204236.36690: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 22690 1727204236.37275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22690 1727204236.37302: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40ffe0> <<< 22690 1727204236.37329: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 22690 1727204236.37368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 22690 1727204236.37408: stdout chunk (state=3): >>>import '_operator' # <<< 22690 1727204236.37425: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d424170> <<< 22690 1727204236.37456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22690 1727204236.37615: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22690 1727204236.37618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204236.37659: stdout chunk (state=3): >>>import 'itertools' # <<< 22690 1727204236.37712: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 22690 1727204236.37715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' <<< 22690 1727204236.37718: stdout chunk (state=3): >>>import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4479b0> <<< 22690 1727204236.37741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 22690 1727204236.37960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d447f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d427c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4253d0> <<< 22690 1727204236.38068: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40d190> <<< 22690 1727204236.38106: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22690 1727204236.38138: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22690 1727204236.38161: stdout chunk (state=3): >>>import '_sre' # <<< 22690 1727204236.38204: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22690 1727204236.38241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22690 1727204236.38282: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 22690 1727204236.38298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22690 1727204236.38352: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d46b980> <<< 22690 1727204236.38374: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d46a5a0> <<< 22690 1727204236.38420: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 22690 1727204236.38571: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4262a0> <<< 22690 1727204236.38585: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d468d70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d498a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40c410> <<< 22690 1727204236.38612: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 22690 1727204236.38626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 22690 1727204236.38869: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d498ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d498d70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d499160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40af30> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4997f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4994f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py <<< 22690 1727204236.38906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d49a6f0> <<< 22690 1727204236.38993: stdout chunk (state=3): >>>import 'importlib.util' # <<< 22690 1727204236.39012: stdout chunk (state=3): >>>import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 22690 1727204236.39043: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 22690 1727204236.39107: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 22690 1727204236.39112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 22690 1727204236.39195: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b48f0> <<< 22690 1727204236.39199: stdout chunk (state=3): >>>import 'errno' # <<< 22690 1727204236.39237: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d4b6030> <<< 22690 1727204236.39241: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 22690 1727204236.39274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22690 1727204236.39302: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 22690 1727204236.39343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 22690 1727204236.39347: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b6ed0> <<< 22690 1727204236.39437: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.39664: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d4b7500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b6420> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d4b7ec0> import 'lzma' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b75f0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d49a660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 22690 1727204236.39693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22690 1727204236.39745: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.40205: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d293d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d2bc830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2bc590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d2bc860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d2bca40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d291ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 22690 1727204236.40209: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 22690 1727204236.40245: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22690 1727204236.40265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 22690 1727204236.40289: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2be0f0> <<< 22690 1727204236.40327: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2bcd70> <<< 22690 1727204236.40370: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d49ae10> <<< 22690 1727204236.40780: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from 
'/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2e64b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 22690 1727204236.40852: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3025a0> <<< 22690 1727204236.40888: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22690 1727204236.40952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22690 1727204236.41055: stdout chunk (state=3): >>>import 'ntpath' # <<< 22690 1727204236.41093: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 22690 1727204236.41113: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d33b350> <<< 22690 1727204236.41143: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22690 1727204236.41212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 22690 1727204236.41243: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22690 1727204236.41315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22690 1727204236.41471: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d361af0> <<< 22690 1727204236.41660: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d33b470> <<< 22690 1727204236.41772: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d303200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1404a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3015e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2bf050> <<< 22690 1727204236.41884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 22690 1727204236.41912: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa27d140740> <<< 22690 1727204236.42043: stdout chunk (state=3): >>># 
zipimport: found 30 names in '/tmp/ansible_stat_payload_kgmue2oy/ansible_stat_payload.zip' <<< 22690 1727204236.42061: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.42381: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.42403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22690 1727204236.42452: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22690 1727204236.42761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d19a270> import '_typing' # <<< 22690 1727204236.42937: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d171160> <<< 22690 1727204236.42962: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1702f0> <<< 22690 1727204236.43093: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.43110: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 22690 1727204236.43139: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.45669: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.47929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d173680> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 22690 1727204236.47934: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22690 1727204236.48067: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d1c5d00> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c5a90> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c53a0> <<< 22690 1727204236.48102: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 22690 
1727204236.48116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 22690 1727204236.48178: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c57f0> <<< 22690 1727204236.48212: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d19af00> <<< 22690 1727204236.48216: stdout chunk (state=3): >>>import 'atexit' # <<< 22690 1727204236.48244: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.48272: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d1c6ab0> <<< 22690 1727204236.48310: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.48561: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d1c6cc0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c7140> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 22690 1727204236.48606: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d028e30> <<< 22690 1727204236.48637: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.48770: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.48814: stdout chunk (state=3): >>>import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d02aa50> <<< 22690 1727204236.48818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02b3e0> <<< 22690 1727204236.48831: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22690 1727204236.48877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22690 1727204236.48903: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02c2f0> <<< 22690 1727204236.48943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 22690 1727204236.49005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 22690 1727204236.49043: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 22690 1727204236.49061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22690 1727204236.49153: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02f050> <<< 22690 1727204236.49227: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.49278: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d02f350> <<< 22690 1727204236.49284: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02d310> <<< 22690 1727204236.49331: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22690 1727204236.49357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 22690 1727204236.49475: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 22690 1727204236.49495: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 22690 1727204236.49674: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d032fc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d031ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d031820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22690 1727204236.49801: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d033fe0> <<< 22690 1727204236.49841: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02d820> <<< 22690 1727204236.49888: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.50073: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d07b050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fa27d07b1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d07cdd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07cb90> <<< 22690 1727204236.50086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22690 1727204236.50244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22690 1727204236.50332: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.50352: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d07f2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07d4c0> <<< 22690 1727204236.50398: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22690 1727204236.50469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204236.50511: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 22690 1727204236.50535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 22690 1727204236.50560: stdout chunk (state=3): >>>import '_string' # <<< 22690 1727204236.50620: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d086ab0> <<< 22690 1727204236.50841: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07f440> <<< 22690 1727204236.50962: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.50965: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.51000: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d087860> <<< 22690 1727204236.51167: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d087aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d087da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07b4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22690 1727204236.51186: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.51232: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d08b560> <<< 22690 1727204236.51930: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d08caa0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d089cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d08b080> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d0898e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204236.51934: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.51936: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 22690 1727204236.51940: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 22690 1727204236.52076: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.52103: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.52978: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.53722: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 22690 1727204236.53745: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 22690 1727204236.53762: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/__init__.py <<< 22690 1727204236.53809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22690 1727204236.53864: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d114c50> <<< 22690 1727204236.53968: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22690 1727204236.53986: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d115940> <<< 22690 1727204236.54067: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d08ff80> import 'ansible.module_utils.compat.selinux' # <<< 22690 1727204236.54074: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.54096: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.54121: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22690 1727204236.54376: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.54636: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 22690 1727204236.54669: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1157c0> # zipimport: zlib available <<< 22690 1727204236.55561: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.56493: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 22690 1727204236.56497: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.56503: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204236.56536: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available <<< 22690 1727204236.56540: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.56687: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 22690 1727204236.57043: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.57161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22690 1727204236.57269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 22690 1727204236.57357: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d116c00> # zipimport: zlib available <<< 22690 1727204236.57671: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.57699: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22690 1727204236.57827: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27cf22300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27cf22c90> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d088110> # zipimport: zlib available <<< 22690 1727204236.57843: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.57877: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22690 1727204236.57893: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.58432: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27cf21a30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf22ed0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available <<< 22690 1727204236.58469: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.58634: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22690 1727204236.58667: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22690 1727204236.58690: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22690 
1727204236.58694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22690 1727204236.58895: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cfb2e40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf2cb90> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf2aea0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf21880> # destroy ansible.module_utils.distro <<< 22690 1727204236.58899: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 22690 1727204236.58981: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22690 1727204236.59014: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 22690 1727204236.59036: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22690 1727204236.59049: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 22690 1727204236.59249: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.59407: stdout chunk (state=3): >>># zipimport: zlib available <<< 22690 1727204236.59524: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 22690 1727204236.59559: stdout chunk (state=3): >>># destroy __main__ <<< 22690 1727204236.59883: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat <<< 22690 1727204236.59929: stdout chunk (state=3): >>># cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] 
removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 <<< 22690 1727204236.59942: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 22690 1727204236.60005: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 22690 1727204236.60235: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 22690 1727204236.60264: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma <<< 22690 1727204236.60290: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 22690 1727204236.60428: stdout chunk (state=3): 
>>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 22690 1727204236.60437: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array <<< 22690 1727204236.60461: stdout chunk (state=3): >>># destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 22690 1727204236.60517: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 22690 1727204236.60557: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 22690 1727204236.60592: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 22690 1727204236.60671: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 22690 1727204236.60690: stdout chunk (state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # 
cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22690 1727204236.60803: stdout chunk (state=3): >>># destroy sys.monitoring <<< 22690 1727204236.60833: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 22690 1727204236.60892: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 22690 1727204236.60900: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing <<< 22690 1727204236.60935: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22690 1727204236.61077: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 22690 1727204236.61106: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 22690 1727204236.61109: stdout chunk (state=3): >>># clear sys.audit hooks <<< 22690 1727204236.61546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204236.61550: stdout chunk (state=3): >>><<< 22690 1727204236.61552: stderr chunk (state=3): >>><<< 22690 1727204236.61821: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d5fc530> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d5cbb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d5feab0> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3d11c0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3d20c0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40ffe0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d424170> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4479b0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d447f80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d427c50> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4253d0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40d190> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d46b980> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d46a5a0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4262a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d468d70> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d498a10> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40c410> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d498ec0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d498d70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d499160> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d40af30> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4997f0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4994f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d49a6f0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b48f0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d4b6030> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fa27d4b6ed0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d4b7500> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b6420> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d4b7ec0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d4b75f0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d49a660> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d293d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d2bc830> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2bc590> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d2bc860> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d2bca40> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d291ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2be0f0> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2bcd70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d49ae10> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2e64b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3025a0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d33b350> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d361af0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d33b470> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d303200> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1404a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d3015e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d2bf050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fa27d140740> # zipimport: found 30 names in '/tmp/ansible_stat_payload_kgmue2oy/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d19a270> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d171160> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1702f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d173680> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d1c5d00> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c5a90> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c53a0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c57f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d19af00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d1c6ab0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d1c6cc0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1c7140> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d028e30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d02aa50> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02b3e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02c2f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02f050> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d02f350> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02d310> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d032fc0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d031ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d031820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d033fe0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d02d820> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d07b050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07b1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d07cdd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07cb90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d07f2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07d4c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d086ab0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07f440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d087860> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d087aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d087da0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d07b4d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d08b560> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d08caa0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d089cd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d08b080> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d0898e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27d114c50> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d115940> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d08ff80> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d1157c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d116c00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27cf22300> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27cf22c90> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27d088110> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fa27cf21a30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf22ed0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cfb2e40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf2cb90> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf2aea0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fa27cf21880> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
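The module trace and result above (a stat of /run/ostree-booted returning "exists": false) belong to the "Check if system is ostree" task whose result is reported a few entries below. A minimal sketch of what that task plausibly looks like in el_repo_setup.yml, assuming the registered variable name that appears later in this log; the exact YAML is not part of this output:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat   # variable name taken from the later "Set flag" task; how it is registered is an assumption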
[WARNING]: Module invocation had junk after the JSON data: [interpreter shutdown trace omitted; it is a verbatim repeat of the cleanup/destroy output shown above] 22690 1727204236.63182: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204236.63186: _low_level_execute_command(): starting 22690 1727204236.63189:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204235.762173-22866-46856302630855/ > /dev/null 2>&1 && sleep 0' 22690 1727204236.63686: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204236.63703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204236.63741: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204236.63843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204236.64042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204236.67074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204236.67079: stdout chunk (state=3): >>><<< 22690 1727204236.67083: stderr chunk (state=3): >>><<< 22690 1727204236.67086: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204236.67088: handler run complete 22690 1727204236.67091: attempt loop complete, returning result 22690 1727204236.67093: _execute() done 22690 1727204236.67096: dumping result to json 22690 1727204236.67098: done dumping result, returning 22690 1727204236.67100: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [127b8e07-fff9-78bb-bf56-00000000008f] 22690 1727204236.67103: sending task result for task 127b8e07-fff9-78bb-bf56-00000000008f 22690 1727204236.67299: done sending task result for task 
127b8e07-fff9-78bb-bf56-00000000008f 22690 1727204236.67303: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 22690 1727204236.67400: no more pending results, returning what we have 22690 1727204236.67404: results queue empty 22690 1727204236.67404: checking for any_errors_fatal 22690 1727204236.67412: done checking for any_errors_fatal 22690 1727204236.67413: checking for max_fail_percentage 22690 1727204236.67415: done checking for max_fail_percentage 22690 1727204236.67416: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.67417: done checking to see if all hosts have failed 22690 1727204236.67418: getting the remaining hosts for this loop 22690 1727204236.67419: done getting the remaining hosts for this loop 22690 1727204236.67424: getting the next task for host managed-node2 22690 1727204236.67432: done getting next task for host managed-node2 22690 1727204236.67436: ^ task is: TASK: Set flag to indicate system is ostree 22690 1727204236.67438: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204236.67442: getting variables 22690 1727204236.67444: in VariableManager get_vars() 22690 1727204236.67478: Calling all_inventory to load vars for managed-node2 22690 1727204236.67481: Calling groups_inventory to load vars for managed-node2 22690 1727204236.67485: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.67499: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.67503: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.67506: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.67870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.68106: done with get_vars() 22690 1727204236.68121: done getting variables 22690 1727204236.68201: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.983) 0:00:03.965 ***** 22690 1727204236.68230: entering _queue_task() for managed-node2/set_fact 22690 1727204236.68232: Creating lock for set_fact 22690 1727204236.68504: worker is 1 (out of 1 available) 22690 1727204236.68519: exiting _queue_task() for managed-node2/set_fact 22690 1727204236.68533: done queuing things up, now waiting for results queue to drain 22690 1727204236.68534: waiting for pending results... 
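A hedged reconstruction of the "Set flag to indicate system is ostree" task queued here (el_repo_setup.yml:22). The conditional and the resulting fact value match the log entries that follow; the exact expression is an assumption:

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # evaluates to false here, matching the result reported below
  when: not __network_is_ostree is defined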
22690 1727204236.68778: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 22690 1727204236.68784: in run() - task 127b8e07-fff9-78bb-bf56-000000000090 22690 1727204236.68787: variable 'ansible_search_path' from source: unknown 22690 1727204236.68790: variable 'ansible_search_path' from source: unknown 22690 1727204236.68814: calling self._execute() 22690 1727204236.68880: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.68885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.68894: variable 'omit' from source: magic vars 22690 1727204236.69272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204236.69461: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204236.69508: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204236.69542: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204236.69574: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204236.69688: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204236.69713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204236.69734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204236.69764: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204236.69892: Evaluated conditional (not __network_is_ostree is defined): True 22690 1727204236.69898: variable 'omit' from source: magic vars 22690 1727204236.69937: variable 'omit' from source: magic vars 22690 1727204236.70237: variable '__ostree_booted_stat' from source: set_fact 22690 1727204236.70241: variable 'omit' from source: magic vars 22690 1727204236.70243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204236.70246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204236.70256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204236.70274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204236.70285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204236.70490: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204236.70493: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.70498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.70600: Set connection var ansible_connection to ssh 22690 
1727204236.70612: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204236.70620: Set connection var ansible_pipelining to False 22690 1727204236.70623: Set connection var ansible_shell_type to sh 22690 1727204236.70629: Set connection var ansible_shell_executable to /bin/sh 22690 1727204236.70638: Set connection var ansible_timeout to 10 22690 1727204236.70671: variable 'ansible_shell_executable' from source: unknown 22690 1727204236.70675: variable 'ansible_connection' from source: unknown 22690 1727204236.70913: variable 'ansible_module_compression' from source: unknown 22690 1727204236.70920: variable 'ansible_shell_type' from source: unknown 22690 1727204236.70923: variable 'ansible_shell_executable' from source: unknown 22690 1727204236.70926: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.70928: variable 'ansible_pipelining' from source: unknown 22690 1727204236.70930: variable 'ansible_timeout' from source: unknown 22690 1727204236.70932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.71024: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204236.71028: variable 'omit' from source: magic vars 22690 1727204236.71050: starting attempt loop 22690 1727204236.71053: running the handler 22690 1727204236.71056: handler run complete 22690 1727204236.71059: attempt loop complete, returning result 22690 1727204236.71061: _execute() done 22690 1727204236.71064: dumping result to json 22690 1727204236.71147: done dumping result, returning 22690 1727204236.71152: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [127b8e07-fff9-78bb-bf56-000000000090] 22690 1727204236.71154: sending task result for task 127b8e07-fff9-78bb-bf56-000000000090 22690 1727204236.71239: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000090 22690 1727204236.71242: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 22690 1727204236.71310: no more pending results, returning what we have 22690 1727204236.71313: results queue empty 22690 1727204236.71314: checking for any_errors_fatal 22690 1727204236.71325: done checking for any_errors_fatal 22690 1727204236.71325: checking for max_fail_percentage 22690 1727204236.71327: done checking for max_fail_percentage 22690 1727204236.71328: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.71329: done checking to see if all hosts have failed 22690 1727204236.71329: getting the remaining hosts for this loop 22690 1727204236.71331: done getting the remaining hosts for this loop 22690 1727204236.71335: getting the next task for host managed-node2 22690 1727204236.71343: done getting next task for host managed-node2 22690 1727204236.71346: ^ task is: TASK: Fix CentOS6 Base repo 22690 1727204236.71348: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204236.71352: getting variables 22690 1727204236.71354: in VariableManager get_vars() 22690 1727204236.71603: Calling all_inventory to load vars for managed-node2 22690 1727204236.71606: Calling groups_inventory to load vars for managed-node2 22690 1727204236.71611: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.71628: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.71631: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.71643: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.72087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.72474: done with get_vars() 22690 1727204236.72488: done getting variables 22690 1727204236.72655: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.044) 0:00:04.010 ***** 22690 1727204236.72692: entering _queue_task() for managed-node2/copy 22690 1727204236.73051: worker is 1 (out of 1 available) 22690 1727204236.73067: exiting _queue_task() for managed-node2/copy 22690 1727204236.73081: done queuing things up, now waiting for results queue to drain 22690 1727204236.73082: waiting for pending results... 
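Sketch of the "Fix CentOS6 Base repo" copy task queued above (el_repo_setup.yml:26). Only the action ('copy') and the condition ansible_distribution == 'CentOS' are visible in this log; the destination and content are assumptions:

- name: Fix CentOS6 Base repo
  copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo          # assumed destination
    content: |
      # repo stanzas pointing at an archived CentOS 6 mirror (assumed content)
  when:
    - ansible_distribution == 'CentOS'               # reported as False below, so the task is skipped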
22690 1727204236.73476: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 22690 1727204236.73481: in run() - task 127b8e07-fff9-78bb-bf56-000000000092 22690 1727204236.73485: variable 'ansible_search_path' from source: unknown 22690 1727204236.73487: variable 'ansible_search_path' from source: unknown 22690 1727204236.73572: calling self._execute() 22690 1727204236.73634: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.73646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.73663: variable 'omit' from source: magic vars 22690 1727204236.74353: variable 'ansible_distribution' from source: facts 22690 1727204236.74447: Evaluated conditional (ansible_distribution == 'CentOS'): False 22690 1727204236.74452: when evaluation is False, skipping this task 22690 1727204236.74455: _execute() done 22690 1727204236.74457: dumping result to json 22690 1727204236.74459: done dumping result, returning 22690 1727204236.74462: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [127b8e07-fff9-78bb-bf56-000000000092] 22690 1727204236.74463: sending task result for task 127b8e07-fff9-78bb-bf56-000000000092 22690 1727204236.74708: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000092 22690 1727204236.74712: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 22690 1727204236.74929: no more pending results, returning what we have 22690 1727204236.74933: results queue empty 22690 1727204236.74934: checking for any_errors_fatal 22690 1727204236.74938: done checking for any_errors_fatal 22690 1727204236.74939: checking for max_fail_percentage 22690 1727204236.74940: done checking for max_fail_percentage 22690 1727204236.74941: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.74942: done checking to see if all hosts have failed 22690 1727204236.74942: getting the remaining hosts for this loop 22690 1727204236.74944: done getting the remaining hosts for this loop 22690 1727204236.74947: getting the next task for host managed-node2 22690 1727204236.74953: done getting next task for host managed-node2 22690 1727204236.74956: ^ task is: TASK: Include the task 'enable_epel.yml' 22690 1727204236.74959: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.74962: getting variables 22690 1727204236.74963: in VariableManager get_vars() 22690 1727204236.74993: Calling all_inventory to load vars for managed-node2 22690 1727204236.74996: Calling groups_inventory to load vars for managed-node2 22690 1727204236.74999: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.75009: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.75012: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.75014: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.75503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.75809: done with get_vars() 22690 1727204236.75820: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.032) 0:00:04.042 ***** 22690 1727204236.75963: entering _queue_task() for managed-node2/include_tasks 22690 1727204236.76536: worker is 1 (out of 1 available) 22690 1727204236.76551: exiting _queue_task() for managed-node2/include_tasks 22690 1727204236.76562: done queuing things up, now waiting for results queue to drain 22690 1727204236.76563: waiting for pending results... 22690 1727204236.76794: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 22690 1727204236.76855: in run() - task 127b8e07-fff9-78bb-bf56-000000000093 22690 1727204236.76877: variable 'ansible_search_path' from source: unknown 22690 1727204236.76885: variable 'ansible_search_path' from source: unknown 22690 1727204236.76945: calling self._execute() 22690 1727204236.77046: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.77058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.77123: variable 'omit' from source: magic vars 22690 1727204236.77798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204236.80716: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204236.80825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204236.80887: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204236.80970: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204236.80989: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204236.81186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204236.81196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204236.81201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 22690 1727204236.81231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204236.81250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204236.81398: variable '__network_is_ostree' from source: set_fact 22690 1727204236.81431: Evaluated conditional (not __network_is_ostree | d(false)): True 22690 1727204236.81471: _execute() done 22690 1727204236.81474: dumping result to json 22690 1727204236.81477: done dumping result, returning 22690 1727204236.81480: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [127b8e07-fff9-78bb-bf56-000000000093] 22690 1727204236.81482: sending task result for task 127b8e07-fff9-78bb-bf56-000000000093 22690 1727204236.81699: no more pending results, returning what we have 22690 1727204236.81705: in VariableManager get_vars() 22690 1727204236.81859: Calling all_inventory to load vars for managed-node2 22690 1727204236.81863: Calling groups_inventory to load vars for managed-node2 22690 1727204236.81875: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.81889: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.81892: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.81896: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.82258: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000093 22690 1727204236.82262: WORKER PROCESS EXITING 22690 1727204236.82316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.82641: done with get_vars() 22690 1727204236.82651: variable 'ansible_search_path' from source: unknown 22690 1727204236.82652: variable 'ansible_search_path' from source: unknown 22690 1727204236.82704: we have included files to process 22690 1727204236.82705: generating all_blocks data 22690 1727204236.82707: done generating all_blocks data 22690 1727204236.82715: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22690 1727204236.82717: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22690 1727204236.82720: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22690 1727204236.83658: done processing included file 22690 1727204236.83661: iterating over new_blocks loaded from include file 22690 1727204236.83663: in VariableManager get_vars() 22690 1727204236.83685: done with get_vars() 22690 1727204236.83687: filtering new block on tags 22690 1727204236.83718: done filtering new block on tags 22690 1727204236.83722: in VariableManager get_vars() 22690 1727204236.83734: done with get_vars() 22690 1727204236.83736: filtering new block on tags 22690 1727204236.83749: done filtering new block on tags 22690 1727204236.83751: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 22690 1727204236.83757: extending task lists for all hosts with included blocks 
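The include that just ran (el_repo_setup.yml:51) can be sketched as follows; the task name, the included file, and the condition are taken from the log, while the relative path is an assumption:

- name: Include the task 'enable_epel.yml'
  include_tasks: enable_epel.yml        # resolves to tests/network/tasks/enable_epel.yml in this collection checkout
  when: not __network_is_ostree | d(false)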
22690 1727204236.83897: done extending task lists 22690 1727204236.83898: done processing included files 22690 1727204236.83899: results queue empty 22690 1727204236.83900: checking for any_errors_fatal 22690 1727204236.83904: done checking for any_errors_fatal 22690 1727204236.83905: checking for max_fail_percentage 22690 1727204236.83906: done checking for max_fail_percentage 22690 1727204236.83907: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.83908: done checking to see if all hosts have failed 22690 1727204236.83909: getting the remaining hosts for this loop 22690 1727204236.83910: done getting the remaining hosts for this loop 22690 1727204236.83912: getting the next task for host managed-node2 22690 1727204236.83916: done getting next task for host managed-node2 22690 1727204236.83919: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 22690 1727204236.83922: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.83924: getting variables 22690 1727204236.83925: in VariableManager get_vars() 22690 1727204236.83939: Calling all_inventory to load vars for managed-node2 22690 1727204236.83942: Calling groups_inventory to load vars for managed-node2 22690 1727204236.83944: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.83950: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.83959: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.83962: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.84161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.84388: done with get_vars() 22690 1727204236.84399: done getting variables 22690 1727204236.84491: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 22690 1727204236.84729: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 40] ********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.088) 0:00:04.131 ***** 22690 1727204236.84791: entering _queue_task() for managed-node2/command 22690 1727204236.84793: Creating lock for command 22690 1727204236.85266: worker is 1 (out of 1 available) 22690 1727204236.85281: exiting _queue_task() for managed-node2/command 22690 1727204236.85293: done queuing things up, now waiting for results queue to drain 22690 1727204236.85295: waiting for pending results... 
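The task being queued here, Create EPEL 40, lives at enable_epel.yml:8 and takes its name from the template "Create EPEL {{ ansible_distribution_major_version }}"; it uses the command action, and the lines that follow show it being skipped because ansible_distribution is not in ['RedHat', 'CentOS'] on this host. A sketch of the task shape the log implies; the actual command line is never printed for a skipped task, so the one below is only a placeholder:

- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: /bin/true                     # placeholder command; the real arguments are not visible in this log
  when: ansible_distribution in ['RedHat', 'CentOS']     # the false_condition reported below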
22690 1727204236.85541: running TaskExecutor() for managed-node2/TASK: Create EPEL 40 22690 1727204236.85682: in run() - task 127b8e07-fff9-78bb-bf56-0000000000ad 22690 1727204236.85707: variable 'ansible_search_path' from source: unknown 22690 1727204236.85740: variable 'ansible_search_path' from source: unknown 22690 1727204236.85773: calling self._execute() 22690 1727204236.85878: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.85890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.85958: variable 'omit' from source: magic vars 22690 1727204236.86401: variable 'ansible_distribution' from source: facts 22690 1727204236.86426: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22690 1727204236.86441: when evaluation is False, skipping this task 22690 1727204236.86455: _execute() done 22690 1727204236.86464: dumping result to json 22690 1727204236.86476: done dumping result, returning 22690 1727204236.86501: done running TaskExecutor() for managed-node2/TASK: Create EPEL 40 [127b8e07-fff9-78bb-bf56-0000000000ad] 22690 1727204236.86504: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000ad 22690 1727204236.86794: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000ad 22690 1727204236.86797: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22690 1727204236.86862: no more pending results, returning what we have 22690 1727204236.86882: results queue empty 22690 1727204236.86884: checking for any_errors_fatal 22690 1727204236.86886: done checking for any_errors_fatal 22690 1727204236.86886: checking for max_fail_percentage 22690 1727204236.86888: done checking for max_fail_percentage 22690 1727204236.86889: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.86890: done checking to see if all hosts have failed 22690 1727204236.86891: getting the remaining hosts for this loop 22690 1727204236.86892: done getting the remaining hosts for this loop 22690 1727204236.86897: getting the next task for host managed-node2 22690 1727204236.86904: done getting next task for host managed-node2 22690 1727204236.86971: ^ task is: TASK: Install yum-utils package 22690 1727204236.86981: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.86986: getting variables 22690 1727204236.86992: in VariableManager get_vars() 22690 1727204236.87135: Calling all_inventory to load vars for managed-node2 22690 1727204236.87139: Calling groups_inventory to load vars for managed-node2 22690 1727204236.87142: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.87154: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.87157: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.87160: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.87380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.87648: done with get_vars() 22690 1727204236.87660: done getting variables 22690 1727204236.87784: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.030) 0:00:04.161 ***** 22690 1727204236.87818: entering _queue_task() for managed-node2/package 22690 1727204236.87820: Creating lock for package 22690 1727204236.88301: worker is 1 (out of 1 available) 22690 1727204236.88313: exiting _queue_task() for managed-node2/package 22690 1727204236.88325: done queuing things up, now waiting for results queue to drain 22690 1727204236.88326: waiting for pending results... 
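Install yum-utils package (enable_epel.yml:26) loads the package action plugin and is skipped by the same distribution check. A sketch of the implied task, assuming state: present, since module arguments are not printed for skipped tasks:

- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils        # implied by the task name
    state: present         # assumed; not shown in the log
  when: ansible_distribution in ['RedHat', 'CentOS']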
22690 1727204236.88594: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 22690 1727204236.88672: in run() - task 127b8e07-fff9-78bb-bf56-0000000000ae 22690 1727204236.88776: variable 'ansible_search_path' from source: unknown 22690 1727204236.88780: variable 'ansible_search_path' from source: unknown 22690 1727204236.88783: calling self._execute() 22690 1727204236.88851: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.88864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.88892: variable 'omit' from source: magic vars 22690 1727204236.89338: variable 'ansible_distribution' from source: facts 22690 1727204236.89363: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22690 1727204236.89373: when evaluation is False, skipping this task 22690 1727204236.89380: _execute() done 22690 1727204236.89388: dumping result to json 22690 1727204236.89396: done dumping result, returning 22690 1727204236.89407: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [127b8e07-fff9-78bb-bf56-0000000000ae] 22690 1727204236.89416: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000ae skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22690 1727204236.89716: no more pending results, returning what we have 22690 1727204236.89721: results queue empty 22690 1727204236.89722: checking for any_errors_fatal 22690 1727204236.89730: done checking for any_errors_fatal 22690 1727204236.89731: checking for max_fail_percentage 22690 1727204236.89733: done checking for max_fail_percentage 22690 1727204236.89733: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.89734: done checking to see if all hosts have failed 22690 1727204236.89735: getting the remaining hosts for this loop 22690 1727204236.89736: done getting the remaining hosts for this loop 22690 1727204236.89741: getting the next task for host managed-node2 22690 1727204236.89751: done getting next task for host managed-node2 22690 1727204236.89753: ^ task is: TASK: Enable EPEL 7 22690 1727204236.89758: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.89761: getting variables 22690 1727204236.89763: in VariableManager get_vars() 22690 1727204236.89799: Calling all_inventory to load vars for managed-node2 22690 1727204236.89802: Calling groups_inventory to load vars for managed-node2 22690 1727204236.89806: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.89822: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.89826: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.89829: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.90182: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000ae 22690 1727204236.90191: WORKER PROCESS EXITING 22690 1727204236.90215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.90424: done with get_vars() 22690 1727204236.90436: done getting variables 22690 1727204236.90500: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.027) 0:00:04.188 ***** 22690 1727204236.90540: entering _queue_task() for managed-node2/command 22690 1727204236.90979: worker is 1 (out of 1 available) 22690 1727204236.90992: exiting _queue_task() for managed-node2/command 22690 1727204236.91003: done queuing things up, now waiting for results queue to drain 22690 1727204236.91005: waiting for pending results... 
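Enable EPEL 7 (enable_epel.yml:32) and the Enable EPEL 8 task after it are further command tasks behind the same conditional, so their skeletons are not repeated here. The facts gathered later in this log report ansible_distribution Fedora with major version 40, which is why every EPEL task in this file is skipped. An illustrative way to surface those values on a host; this task is not part of the test suite:

- name: Show the facts behind the EPEL skip conditional   # illustrative only, not in enable_epel.yml
  ansible.builtin.debug:
    msg: "{{ ansible_distribution }} {{ ansible_distribution_major_version }}"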
22690 1727204236.91205: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 22690 1727204236.91342: in run() - task 127b8e07-fff9-78bb-bf56-0000000000af 22690 1727204236.91363: variable 'ansible_search_path' from source: unknown 22690 1727204236.91375: variable 'ansible_search_path' from source: unknown 22690 1727204236.91427: calling self._execute() 22690 1727204236.91525: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.91538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.91554: variable 'omit' from source: magic vars 22690 1727204236.91985: variable 'ansible_distribution' from source: facts 22690 1727204236.92005: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22690 1727204236.92013: when evaluation is False, skipping this task 22690 1727204236.92021: _execute() done 22690 1727204236.92027: dumping result to json 22690 1727204236.92047: done dumping result, returning 22690 1727204236.92052: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [127b8e07-fff9-78bb-bf56-0000000000af] 22690 1727204236.92157: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000af 22690 1727204236.92235: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000af 22690 1727204236.92239: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22690 1727204236.92313: no more pending results, returning what we have 22690 1727204236.92318: results queue empty 22690 1727204236.92319: checking for any_errors_fatal 22690 1727204236.92328: done checking for any_errors_fatal 22690 1727204236.92329: checking for max_fail_percentage 22690 1727204236.92331: done checking for max_fail_percentage 22690 1727204236.92332: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.92333: done checking to see if all hosts have failed 22690 1727204236.92334: getting the remaining hosts for this loop 22690 1727204236.92335: done getting the remaining hosts for this loop 22690 1727204236.92340: getting the next task for host managed-node2 22690 1727204236.92348: done getting next task for host managed-node2 22690 1727204236.92351: ^ task is: TASK: Enable EPEL 8 22690 1727204236.92356: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.92362: getting variables 22690 1727204236.92364: in VariableManager get_vars() 22690 1727204236.92404: Calling all_inventory to load vars for managed-node2 22690 1727204236.92408: Calling groups_inventory to load vars for managed-node2 22690 1727204236.92412: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.92429: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.92434: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.92438: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.92923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.93152: done with get_vars() 22690 1727204236.93162: done getting variables 22690 1727204236.93236: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.027) 0:00:04.216 ***** 22690 1727204236.93273: entering _queue_task() for managed-node2/command 22690 1727204236.93710: worker is 1 (out of 1 available) 22690 1727204236.93723: exiting _queue_task() for managed-node2/command 22690 1727204236.93735: done queuing things up, now waiting for results queue to drain 22690 1727204236.93737: waiting for pending results... 22690 1727204236.93981: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 22690 1727204236.94075: in run() - task 127b8e07-fff9-78bb-bf56-0000000000b0 22690 1727204236.94110: variable 'ansible_search_path' from source: unknown 22690 1727204236.94114: variable 'ansible_search_path' from source: unknown 22690 1727204236.94185: calling self._execute() 22690 1727204236.94269: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.94283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.94314: variable 'omit' from source: magic vars 22690 1727204236.94775: variable 'ansible_distribution' from source: facts 22690 1727204236.94831: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22690 1727204236.94835: when evaluation is False, skipping this task 22690 1727204236.94838: _execute() done 22690 1727204236.94841: dumping result to json 22690 1727204236.94848: done dumping result, returning 22690 1727204236.94851: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [127b8e07-fff9-78bb-bf56-0000000000b0] 22690 1727204236.94854: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000b0 22690 1727204236.94941: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000b0 22690 1727204236.94944: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22690 1727204236.95014: no more pending results, returning what we have 22690 1727204236.95021: results queue empty 22690 1727204236.95022: checking for any_errors_fatal 22690 1727204236.95033: done checking for any_errors_fatal 22690 1727204236.95034: checking 
for max_fail_percentage 22690 1727204236.95035: done checking for max_fail_percentage 22690 1727204236.95036: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.95037: done checking to see if all hosts have failed 22690 1727204236.95038: getting the remaining hosts for this loop 22690 1727204236.95039: done getting the remaining hosts for this loop 22690 1727204236.95043: getting the next task for host managed-node2 22690 1727204236.95052: done getting next task for host managed-node2 22690 1727204236.95054: ^ task is: TASK: Enable EPEL 6 22690 1727204236.95058: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204236.95063: getting variables 22690 1727204236.95064: in VariableManager get_vars() 22690 1727204236.95094: Calling all_inventory to load vars for managed-node2 22690 1727204236.95097: Calling groups_inventory to load vars for managed-node2 22690 1727204236.95100: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.95111: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.95114: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.95119: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.95263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.95392: done with get_vars() 22690 1727204236.95401: done getting variables 22690 1727204236.95449: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.022) 0:00:04.238 ***** 22690 1727204236.95476: entering _queue_task() for managed-node2/copy 22690 1727204236.95713: worker is 1 (out of 1 available) 22690 1727204236.95730: exiting _queue_task() for managed-node2/copy 22690 1727204236.95744: done queuing things up, now waiting for results queue to drain 22690 1727204236.95745: waiting for pending results... 
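Enable EPEL 6 (enable_epel.yml:42) differs from the earlier tasks only in its action plugin: the log loads copy rather than command or package, and the task is skipped by the same check. A heavily hedged sketch of the shape such a task could take; both paths below are hypothetical, since none of the module arguments appear in this log:

- name: Enable EPEL 6
  ansible.builtin.copy:
    src: epel6.repo                        # hypothetical source file
    dest: /etc/yum.repos.d/epel6.repo      # hypothetical destination
  when: ansible_distribution in ['RedHat', 'CentOS']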
22690 1727204236.95908: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 22690 1727204236.95971: in run() - task 127b8e07-fff9-78bb-bf56-0000000000b2 22690 1727204236.95988: variable 'ansible_search_path' from source: unknown 22690 1727204236.95992: variable 'ansible_search_path' from source: unknown 22690 1727204236.96023: calling self._execute() 22690 1727204236.96098: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.96102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.96172: variable 'omit' from source: magic vars 22690 1727204236.96406: variable 'ansible_distribution' from source: facts 22690 1727204236.96425: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22690 1727204236.96430: when evaluation is False, skipping this task 22690 1727204236.96433: _execute() done 22690 1727204236.96436: dumping result to json 22690 1727204236.96438: done dumping result, returning 22690 1727204236.96441: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [127b8e07-fff9-78bb-bf56-0000000000b2] 22690 1727204236.96443: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000b2 22690 1727204236.96550: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000b2 22690 1727204236.96555: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22690 1727204236.96620: no more pending results, returning what we have 22690 1727204236.96625: results queue empty 22690 1727204236.96625: checking for any_errors_fatal 22690 1727204236.96630: done checking for any_errors_fatal 22690 1727204236.96631: checking for max_fail_percentage 22690 1727204236.96632: done checking for max_fail_percentage 22690 1727204236.96633: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.96634: done checking to see if all hosts have failed 22690 1727204236.96634: getting the remaining hosts for this loop 22690 1727204236.96636: done getting the remaining hosts for this loop 22690 1727204236.96640: getting the next task for host managed-node2 22690 1727204236.96647: done getting next task for host managed-node2 22690 1727204236.96650: ^ task is: TASK: Set network provider to 'nm' 22690 1727204236.96652: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.96655: getting variables 22690 1727204236.96656: in VariableManager get_vars() 22690 1727204236.96685: Calling all_inventory to load vars for managed-node2 22690 1727204236.96688: Calling groups_inventory to load vars for managed-node2 22690 1727204236.96691: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.96701: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.96704: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.96707: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.96897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.97042: done with get_vars() 22690 1727204236.97054: done getting variables 22690 1727204236.97122: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Tuesday 24 September 2024 14:57:16 -0400 (0:00:00.016) 0:00:04.254 ***** 22690 1727204236.97150: entering _queue_task() for managed-node2/set_fact 22690 1727204236.97445: worker is 1 (out of 1 available) 22690 1727204236.97458: exiting _queue_task() for managed-node2/set_fact 22690 1727204236.97474: done queuing things up, now waiting for results queue to drain 22690 1727204236.97475: waiting for pending results... 22690 1727204236.97762: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 22690 1727204236.97820: in run() - task 127b8e07-fff9-78bb-bf56-000000000007 22690 1727204236.97833: variable 'ansible_search_path' from source: unknown 22690 1727204236.98071: calling self._execute() 22690 1727204236.98075: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.98078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.98082: variable 'omit' from source: magic vars 22690 1727204236.98102: variable 'omit' from source: magic vars 22690 1727204236.98140: variable 'omit' from source: magic vars 22690 1727204236.98186: variable 'omit' from source: magic vars 22690 1727204236.98239: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204236.98292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204236.98323: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204236.98349: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204236.98372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204236.98409: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204236.98420: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.98431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.98542: Set connection var ansible_connection to 
ssh 22690 1727204236.98560: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204236.98575: Set connection var ansible_pipelining to False 22690 1727204236.98583: Set connection var ansible_shell_type to sh 22690 1727204236.98592: Set connection var ansible_shell_executable to /bin/sh 22690 1727204236.98605: Set connection var ansible_timeout to 10 22690 1727204236.98632: variable 'ansible_shell_executable' from source: unknown 22690 1727204236.98645: variable 'ansible_connection' from source: unknown 22690 1727204236.98653: variable 'ansible_module_compression' from source: unknown 22690 1727204236.98659: variable 'ansible_shell_type' from source: unknown 22690 1727204236.98667: variable 'ansible_shell_executable' from source: unknown 22690 1727204236.98676: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204236.98683: variable 'ansible_pipelining' from source: unknown 22690 1727204236.98690: variable 'ansible_timeout' from source: unknown 22690 1727204236.98696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204236.98868: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204236.98890: variable 'omit' from source: magic vars 22690 1727204236.98900: starting attempt loop 22690 1727204236.98907: running the handler 22690 1727204236.98923: handler run complete 22690 1727204236.98937: attempt loop complete, returning result 22690 1727204236.98944: _execute() done 22690 1727204236.98952: dumping result to json 22690 1727204236.98960: done dumping result, returning 22690 1727204236.98978: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [127b8e07-fff9-78bb-bf56-000000000007] 22690 1727204236.99012: sending task result for task 127b8e07-fff9-78bb-bf56-000000000007 22690 1727204236.99109: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000007 22690 1727204236.99114: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 22690 1727204236.99179: no more pending results, returning what we have 22690 1727204236.99183: results queue empty 22690 1727204236.99184: checking for any_errors_fatal 22690 1727204236.99190: done checking for any_errors_fatal 22690 1727204236.99191: checking for max_fail_percentage 22690 1727204236.99193: done checking for max_fail_percentage 22690 1727204236.99194: checking to see if all hosts have failed and the running result is not ok 22690 1727204236.99195: done checking to see if all hosts have failed 22690 1727204236.99196: getting the remaining hosts for this loop 22690 1727204236.99197: done getting the remaining hosts for this loop 22690 1727204236.99202: getting the next task for host managed-node2 22690 1727204236.99209: done getting next task for host managed-node2 22690 1727204236.99211: ^ task is: TASK: meta (flush_handlers) 22690 1727204236.99213: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204236.99218: getting variables 22690 1727204236.99219: in VariableManager get_vars() 22690 1727204236.99251: Calling all_inventory to load vars for managed-node2 22690 1727204236.99253: Calling groups_inventory to load vars for managed-node2 22690 1727204236.99261: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.99275: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.99277: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.99280: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.99439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204236.99562: done with get_vars() 22690 1727204236.99573: done getting variables 22690 1727204236.99630: in VariableManager get_vars() 22690 1727204236.99638: Calling all_inventory to load vars for managed-node2 22690 1727204236.99640: Calling groups_inventory to load vars for managed-node2 22690 1727204236.99642: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204236.99645: Calling all_plugins_play to load vars for managed-node2 22690 1727204236.99647: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204236.99648: Calling groups_plugins_play to load vars for managed-node2 22690 1727204236.99905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204237.00021: done with get_vars() 22690 1727204237.00032: done queuing things up, now waiting for results queue to drain 22690 1727204237.00033: results queue empty 22690 1727204237.00034: checking for any_errors_fatal 22690 1727204237.00035: done checking for any_errors_fatal 22690 1727204237.00036: checking for max_fail_percentage 22690 1727204237.00036: done checking for max_fail_percentage 22690 1727204237.00037: checking to see if all hosts have failed and the running result is not ok 22690 1727204237.00037: done checking to see if all hosts have failed 22690 1727204237.00038: getting the remaining hosts for this loop 22690 1727204237.00038: done getting the remaining hosts for this loop 22690 1727204237.00040: getting the next task for host managed-node2 22690 1727204237.00043: done getting next task for host managed-node2 22690 1727204237.00044: ^ task is: TASK: meta (flush_handlers) 22690 1727204237.00046: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204237.00053: getting variables 22690 1727204237.00054: in VariableManager get_vars() 22690 1727204237.00062: Calling all_inventory to load vars for managed-node2 22690 1727204237.00063: Calling groups_inventory to load vars for managed-node2 22690 1727204237.00066: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204237.00071: Calling all_plugins_play to load vars for managed-node2 22690 1727204237.00073: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204237.00074: Calling groups_plugins_play to load vars for managed-node2 22690 1727204237.00156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204237.00276: done with get_vars() 22690 1727204237.00283: done getting variables 22690 1727204237.00320: in VariableManager get_vars() 22690 1727204237.00326: Calling all_inventory to load vars for managed-node2 22690 1727204237.00328: Calling groups_inventory to load vars for managed-node2 22690 1727204237.00329: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204237.00333: Calling all_plugins_play to load vars for managed-node2 22690 1727204237.00334: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204237.00336: Calling groups_plugins_play to load vars for managed-node2 22690 1727204237.00440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204237.00563: done with get_vars() 22690 1727204237.00576: done queuing things up, now waiting for results queue to drain 22690 1727204237.00578: results queue empty 22690 1727204237.00578: checking for any_errors_fatal 22690 1727204237.00579: done checking for any_errors_fatal 22690 1727204237.00580: checking for max_fail_percentage 22690 1727204237.00580: done checking for max_fail_percentage 22690 1727204237.00581: checking to see if all hosts have failed and the running result is not ok 22690 1727204237.00581: done checking to see if all hosts have failed 22690 1727204237.00582: getting the remaining hosts for this loop 22690 1727204237.00582: done getting the remaining hosts for this loop 22690 1727204237.00584: getting the next task for host managed-node2 22690 1727204237.00587: done getting next task for host managed-node2 22690 1727204237.00587: ^ task is: None 22690 1727204237.00588: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204237.00589: done queuing things up, now waiting for results queue to drain 22690 1727204237.00589: results queue empty 22690 1727204237.00590: checking for any_errors_fatal 22690 1727204237.00590: done checking for any_errors_fatal 22690 1727204237.00591: checking for max_fail_percentage 22690 1727204237.00591: done checking for max_fail_percentage 22690 1727204237.00592: checking to see if all hosts have failed and the running result is not ok 22690 1727204237.00592: done checking to see if all hosts have failed 22690 1727204237.00594: getting the next task for host managed-node2 22690 1727204237.00596: done getting next task for host managed-node2 22690 1727204237.00597: ^ task is: None 22690 1727204237.00598: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204237.00642: in VariableManager get_vars() 22690 1727204237.00653: done with get_vars() 22690 1727204237.00657: in VariableManager get_vars() 22690 1727204237.00662: done with get_vars() 22690 1727204237.00667: variable 'omit' from source: magic vars 22690 1727204237.00689: in VariableManager get_vars() 22690 1727204237.00695: done with get_vars() 22690 1727204237.00712: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 22690 1727204237.00857: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204237.00884: getting the remaining hosts for this loop 22690 1727204237.00885: done getting the remaining hosts for this loop 22690 1727204237.00887: getting the next task for host managed-node2 22690 1727204237.00889: done getting next task for host managed-node2 22690 1727204237.00890: ^ task is: TASK: Gathering Facts 22690 1727204237.00891: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204237.00893: getting variables 22690 1727204237.00893: in VariableManager get_vars() 22690 1727204237.00899: Calling all_inventory to load vars for managed-node2 22690 1727204237.00901: Calling groups_inventory to load vars for managed-node2 22690 1727204237.00906: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204237.00913: Calling all_plugins_play to load vars for managed-node2 22690 1727204237.00932: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204237.00940: Calling groups_plugins_play to load vars for managed-node2 22690 1727204237.01103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204237.01311: done with get_vars() 22690 1727204237.01319: done getting variables 22690 1727204237.01363: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Tuesday 24 September 2024 14:57:17 -0400 (0:00:00.042) 0:00:04.297 ***** 22690 1727204237.01389: entering _queue_task() for managed-node2/gather_facts 22690 1727204237.01740: worker is 1 (out of 1 available) 22690 1727204237.01754: exiting _queue_task() for managed-node2/gather_facts 22690 1727204237.01898: done queuing things up, now waiting for results queue to drain 22690 1727204237.01900: waiting for pending results... 
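Gathering Facts is the fact-gathering step of the new play for showing the network provider. The log that follows shows the gather_facts action evaluating ansible_distribution_major_version != '6', reusing the persistent SSH connection (auto-mux against the existing control master), creating a remote temporary directory under /root/.ansible/tmp, transferring the cached AnsiballZ_setup.py payload over SFTP, and executing it with /usr/bin/python3.12. Written as an explicit task, the step is roughly equivalent to the sketch below; in the playbook it is the implicit fact-gathering phase, so this YAML is illustrative only:

- name: Gathering Facts
  ansible.builtin.setup:                                  # the module whose AnsiballZ payload is shipped and run below
  when: ansible_distribution_major_version != '6'         # conditional evaluated in the log that follows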
22690 1727204237.02040: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204237.02101: in run() - task 127b8e07-fff9-78bb-bf56-0000000000d8 22690 1727204237.02115: variable 'ansible_search_path' from source: unknown 22690 1727204237.02150: calling self._execute() 22690 1727204237.02223: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204237.02227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204237.02235: variable 'omit' from source: magic vars 22690 1727204237.02553: variable 'ansible_distribution_major_version' from source: facts 22690 1727204237.02566: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204237.02573: variable 'omit' from source: magic vars 22690 1727204237.02596: variable 'omit' from source: magic vars 22690 1727204237.02628: variable 'omit' from source: magic vars 22690 1727204237.02664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204237.02697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204237.02717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204237.02734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204237.02745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204237.02771: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204237.02775: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204237.02779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204237.02856: Set connection var ansible_connection to ssh 22690 1727204237.02867: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204237.02875: Set connection var ansible_pipelining to False 22690 1727204237.02878: Set connection var ansible_shell_type to sh 22690 1727204237.02883: Set connection var ansible_shell_executable to /bin/sh 22690 1727204237.02896: Set connection var ansible_timeout to 10 22690 1727204237.02911: variable 'ansible_shell_executable' from source: unknown 22690 1727204237.02914: variable 'ansible_connection' from source: unknown 22690 1727204237.02917: variable 'ansible_module_compression' from source: unknown 22690 1727204237.02924: variable 'ansible_shell_type' from source: unknown 22690 1727204237.02927: variable 'ansible_shell_executable' from source: unknown 22690 1727204237.02930: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204237.02932: variable 'ansible_pipelining' from source: unknown 22690 1727204237.02934: variable 'ansible_timeout' from source: unknown 22690 1727204237.02943: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204237.03093: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204237.03102: variable 'omit' from source: magic vars 22690 1727204237.03107: starting attempt loop 22690 1727204237.03110: running the 
handler 22690 1727204237.03128: variable 'ansible_facts' from source: unknown 22690 1727204237.03146: _low_level_execute_command(): starting 22690 1727204237.03155: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204237.03778: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204237.03783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204237.03786: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204237.03789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204237.03833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204237.03836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204237.03838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204237.03922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204237.06362: stdout chunk (state=3): >>>/root <<< 22690 1727204237.06624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204237.06628: stdout chunk (state=3): >>><<< 22690 1727204237.06631: stderr chunk (state=3): >>><<< 22690 1727204237.06768: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204237.06773: _low_level_execute_command(): starting 22690 1727204237.06776: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702 `" && echo ansible-tmp-1727204237.066628-22925-13226551343702="` echo /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702 `" ) && sleep 0' 22690 1727204237.07312: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204237.07327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204237.07352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204237.07391: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204237.07404: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204237.07495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204237.10370: stdout chunk (state=3): >>>ansible-tmp-1727204237.066628-22925-13226551343702=/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702 <<< 22690 1727204237.10650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204237.10654: stdout chunk (state=3): >>><<< 22690 1727204237.10657: stderr chunk (state=3): >>><<< 22690 1727204237.10873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204237.066628-22925-13226551343702=/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204237.10877: variable 'ansible_module_compression' from source: unknown 22690 1727204237.10880: ANSIBALLZ: 
using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204237.10882: variable 'ansible_facts' from source: unknown 22690 1727204237.11080: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py 22690 1727204237.11482: Sending initial data 22690 1727204237.11491: Sent initial data (152 bytes) 22690 1727204237.12116: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204237.12219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204237.12250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204237.12278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204237.12292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204237.12406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204237.14753: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204237.14854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204237.14940: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmphln21ruv /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py <<< 22690 1727204237.14943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py" <<< 22690 1727204237.15048: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmphln21ruv" to remote "/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py" <<< 22690 1727204237.17029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204237.17034: stderr chunk (state=3): >>><<< 22690 1727204237.17036: stdout chunk (state=3): >>><<< 22690 1727204237.17038: done transferring module to remote 22690 1727204237.17041: _low_level_execute_command(): starting 22690 1727204237.17043: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/ /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py && sleep 0' 22690 1727204237.17669: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204237.17692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204237.17713: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204237.17819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204237.20506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204237.20608: stderr chunk (state=3): >>><<< 22690 1727204237.20618: stdout chunk (state=3): >>><<< 22690 1727204237.20873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204237.20877: _low_level_execute_command(): starting 22690 1727204237.20880: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/AnsiballZ_setup.py && sleep 0' 22690 1727204237.22307: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204237.22436: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204237.22772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204237.22776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204237.22779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204237.22782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204238.09396: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "17", "epoch": "1727204237", "epoch_int": "1727204237", "date": "2024-09-24", "time": "14:57:17", "iso8601_micro": "2024-09-24T18:57:17.703525Z", "iso8601": "2024-09-24T18:57:17Z", "iso8601_basic": "20240924T145717703525", "iso8601_basic_short": "20240924T145717", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3030, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible<<< 22690 1727204238.09416: stdout chunk (state=3): >>>_memory_mb": {"real": {"total": 3716, "used": 686, "free": 3030}, "nocache": {"free": 3462, "used": 254}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": 
"Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 584, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316666368, "block_size": 4096, "block_total": 64479564, "block_available": 61356608, "block_used": 3122956, "inode_total": 16384000, "inode_available": 16301508, "inode_used": 82492, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.4873046875, "5m": 0.498046875, "15m": 0.271484375}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": 
{"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204238.12092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204238.12123: stderr chunk (state=3): >>><<< 22690 1727204238.12133: stdout chunk (state=3): >>><<< 22690 1727204238.12223: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "17", "epoch": "1727204237", "epoch_int": "1727204237", "date": "2024-09-24", "time": "14:57:17", "iso8601_micro": "2024-09-24T18:57:17.703525Z", "iso8601": "2024-09-24T18:57:17Z", "iso8601_basic": "20240924T145717703525", "iso8601_basic_short": "20240924T145717", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3030, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 686, "free": 3030}, "nocache": {"free": 3462, "used": 254}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 584, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316666368, "block_size": 4096, "block_total": 64479564, "block_available": 61356608, "block_used": 3122956, "inode_total": 16384000, "inode_available": 16301508, "inode_used": 82492, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_loadavg": {"1m": 0.4873046875, "5m": 0.498046875, "15m": 0.271484375}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204238.12536: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204238.12582: _low_level_execute_command(): starting 22690 1727204238.12594: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204237.066628-22925-13226551343702/ > /dev/null 2>&1 && sleep 0' 22690 1727204238.13527: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204238.13547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204238.13581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.13690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204238.13719: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204238.13834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204238.16657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204238.16661: stdout chunk (state=3): >>><<< 22690 1727204238.16664: stderr chunk (state=3): >>><<< 22690 1727204238.16866: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204238.16877: handler run complete 22690 1727204238.16881: variable 'ansible_facts' from source: unknown 22690 1727204238.16984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.17313: variable 'ansible_facts' from source: unknown 22690 1727204238.17381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.17539: attempt loop complete, returning result 22690 1727204238.17551: _execute() done 22690 1727204238.17559: dumping result to json 22690 1727204238.17595: done dumping result, returning 22690 1727204238.17639: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-0000000000d8] 22690 1727204238.17642: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000d8 ok: [managed-node2] 22690 1727204238.18528: no more pending results, returning what we have 22690 1727204238.18532: results queue empty 22690 1727204238.18533: checking for any_errors_fatal 22690 1727204238.18534: done checking for any_errors_fatal 22690 1727204238.18535: checking for max_fail_percentage 22690 1727204238.18537: done checking for max_fail_percentage 22690 1727204238.18537: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.18538: done checking to see if all hosts have failed 22690 1727204238.18539: getting the remaining hosts for this loop 22690 1727204238.18540: done getting the remaining hosts for this loop 22690 1727204238.18545: getting the next task for host managed-node2 22690 1727204238.18551: done getting next task for host managed-node2 22690 1727204238.18552: ^ task is: TASK: meta (flush_handlers) 22690 1727204238.18554: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204238.18557: getting variables 22690 1727204238.18559: in VariableManager get_vars() 22690 1727204238.18583: Calling all_inventory to load vars for managed-node2 22690 1727204238.18591: Calling groups_inventory to load vars for managed-node2 22690 1727204238.18594: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.18608: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.18611: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.18623: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000d8 22690 1727204238.18630: WORKER PROCESS EXITING 22690 1727204238.18635: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.18859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.19052: done with get_vars() 22690 1727204238.19064: done getting variables 22690 1727204238.19452: in VariableManager get_vars() 22690 1727204238.19465: Calling all_inventory to load vars for managed-node2 22690 1727204238.19469: Calling groups_inventory to load vars for managed-node2 22690 1727204238.19472: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.19477: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.19480: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.19483: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.19905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.20189: done with get_vars() 22690 1727204238.20209: done queuing things up, now waiting for results queue to drain 22690 1727204238.20211: results queue empty 22690 1727204238.20212: checking for any_errors_fatal 22690 1727204238.20237: done checking for any_errors_fatal 22690 1727204238.20238: checking for max_fail_percentage 22690 1727204238.20240: done checking for max_fail_percentage 22690 1727204238.20240: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.20261: done checking to see if all hosts have failed 22690 1727204238.20262: getting the remaining hosts for this loop 22690 1727204238.20263: done getting the remaining hosts for this loop 22690 1727204238.20270: getting the next task for host managed-node2 22690 1727204238.20275: done getting next task for host managed-node2 22690 1727204238.20277: ^ task is: TASK: Show inside ethernet tests 22690 1727204238.20279: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204238.20281: getting variables 22690 1727204238.20282: in VariableManager get_vars() 22690 1727204238.20292: Calling all_inventory to load vars for managed-node2 22690 1727204238.20295: Calling groups_inventory to load vars for managed-node2 22690 1727204238.20297: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.20305: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.20308: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.20312: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.20462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.20680: done with get_vars() 22690 1727204238.20696: done getting variables 22690 1727204238.20799: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Tuesday 24 September 2024 14:57:18 -0400 (0:00:01.194) 0:00:05.491 ***** 22690 1727204238.20828: entering _queue_task() for managed-node2/debug 22690 1727204238.20830: Creating lock for debug 22690 1727204238.21197: worker is 1 (out of 1 available) 22690 1727204238.21213: exiting _queue_task() for managed-node2/debug 22690 1727204238.21228: done queuing things up, now waiting for results queue to drain 22690 1727204238.21234: waiting for pending results... 
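The debug tasks that follow ("Show inside ethernet tests" and, further down, "Show network_provider") run entirely on the controller: the only work is evaluating the conditional ansible_distribution_major_version != '6' against the facts gathered above and emitting the message, which is why no _low_level_execute_command or sftp records appear for them. A minimal stand-in for that behaviour, with the Jinja2 conditional reduced to a plain Python comparison (illustration only, not Ansible's internals):

# Fact value taken from the gather run above (Fedora 40).
facts = {"ansible_distribution_major_version": "40"}

def run_debug_task(msg: str) -> dict:
    # Stand-in for the task conditional: ansible_distribution_major_version != '6'
    if facts["ansible_distribution_major_version"] == "6":
        return {"skipped": True}
    # The debug action never changes the host, so the result is always ok/unchanged.
    return {"changed": False, "msg": msg}

print(run_debug_task("Inside ethernet tests"))
# -> {'changed': False, 'msg': 'Inside ethernet tests'}, matching "ok: [managed-node2]" above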
22690 1727204238.21530: running TaskExecutor() for managed-node2/TASK: Show inside ethernet tests 22690 1727204238.21603: in run() - task 127b8e07-fff9-78bb-bf56-00000000000b 22690 1727204238.21624: variable 'ansible_search_path' from source: unknown 22690 1727204238.21658: calling self._execute() 22690 1727204238.21729: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.21735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.21744: variable 'omit' from source: magic vars 22690 1727204238.22045: variable 'ansible_distribution_major_version' from source: facts 22690 1727204238.22057: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204238.22063: variable 'omit' from source: magic vars 22690 1727204238.22089: variable 'omit' from source: magic vars 22690 1727204238.22122: variable 'omit' from source: magic vars 22690 1727204238.22158: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204238.22192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204238.22212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204238.22230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204238.22242: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204238.22268: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204238.22271: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.22274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.22352: Set connection var ansible_connection to ssh 22690 1727204238.22360: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204238.22368: Set connection var ansible_pipelining to False 22690 1727204238.22371: Set connection var ansible_shell_type to sh 22690 1727204238.22377: Set connection var ansible_shell_executable to /bin/sh 22690 1727204238.22384: Set connection var ansible_timeout to 10 22690 1727204238.22404: variable 'ansible_shell_executable' from source: unknown 22690 1727204238.22407: variable 'ansible_connection' from source: unknown 22690 1727204238.22410: variable 'ansible_module_compression' from source: unknown 22690 1727204238.22413: variable 'ansible_shell_type' from source: unknown 22690 1727204238.22418: variable 'ansible_shell_executable' from source: unknown 22690 1727204238.22421: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.22423: variable 'ansible_pipelining' from source: unknown 22690 1727204238.22425: variable 'ansible_timeout' from source: unknown 22690 1727204238.22428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.22546: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204238.22559: variable 'omit' from source: magic vars 22690 1727204238.22562: starting attempt loop 22690 1727204238.22566: running the 
handler 22690 1727204238.22606: handler run complete 22690 1727204238.22684: attempt loop complete, returning result 22690 1727204238.22688: _execute() done 22690 1727204238.22691: dumping result to json 22690 1727204238.22693: done dumping result, returning 22690 1727204238.22699: done running TaskExecutor() for managed-node2/TASK: Show inside ethernet tests [127b8e07-fff9-78bb-bf56-00000000000b] 22690 1727204238.22704: sending task result for task 127b8e07-fff9-78bb-bf56-00000000000b 22690 1727204238.22802: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000000b 22690 1727204238.22805: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Inside ethernet tests 22690 1727204238.22863: no more pending results, returning what we have 22690 1727204238.22868: results queue empty 22690 1727204238.22870: checking for any_errors_fatal 22690 1727204238.22871: done checking for any_errors_fatal 22690 1727204238.22872: checking for max_fail_percentage 22690 1727204238.22874: done checking for max_fail_percentage 22690 1727204238.22874: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.22875: done checking to see if all hosts have failed 22690 1727204238.22876: getting the remaining hosts for this loop 22690 1727204238.22878: done getting the remaining hosts for this loop 22690 1727204238.22882: getting the next task for host managed-node2 22690 1727204238.22889: done getting next task for host managed-node2 22690 1727204238.22892: ^ task is: TASK: Show network_provider 22690 1727204238.22894: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204238.22898: getting variables 22690 1727204238.22899: in VariableManager get_vars() 22690 1727204238.22942: Calling all_inventory to load vars for managed-node2 22690 1727204238.22944: Calling groups_inventory to load vars for managed-node2 22690 1727204238.22948: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.22960: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.22962: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.22967: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.23150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.23274: done with get_vars() 22690 1727204238.23283: done getting variables 22690 1727204238.23331: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.025) 0:00:05.516 ***** 22690 1727204238.23352: entering _queue_task() for managed-node2/debug 22690 1727204238.23697: worker is 1 (out of 1 available) 22690 1727204238.23711: exiting _queue_task() for managed-node2/debug 22690 1727204238.23725: done queuing things up, now waiting for results queue to drain 22690 1727204238.23727: waiting for pending results... 22690 1727204238.24096: running TaskExecutor() for managed-node2/TASK: Show network_provider 22690 1727204238.24107: in run() - task 127b8e07-fff9-78bb-bf56-00000000000c 22690 1727204238.24132: variable 'ansible_search_path' from source: unknown 22690 1727204238.24188: calling self._execute() 22690 1727204238.24308: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.24356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.24420: variable 'omit' from source: magic vars 22690 1727204238.24777: variable 'ansible_distribution_major_version' from source: facts 22690 1727204238.24789: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204238.24795: variable 'omit' from source: magic vars 22690 1727204238.24822: variable 'omit' from source: magic vars 22690 1727204238.24853: variable 'omit' from source: magic vars 22690 1727204238.24890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204238.24929: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204238.24951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204238.24967: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204238.24979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204238.25003: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204238.25006: variable 'ansible_host' from source: host vars for 
'managed-node2' 22690 1727204238.25009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.25088: Set connection var ansible_connection to ssh 22690 1727204238.25098: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204238.25105: Set connection var ansible_pipelining to False 22690 1727204238.25108: Set connection var ansible_shell_type to sh 22690 1727204238.25113: Set connection var ansible_shell_executable to /bin/sh 22690 1727204238.25121: Set connection var ansible_timeout to 10 22690 1727204238.25141: variable 'ansible_shell_executable' from source: unknown 22690 1727204238.25144: variable 'ansible_connection' from source: unknown 22690 1727204238.25147: variable 'ansible_module_compression' from source: unknown 22690 1727204238.25150: variable 'ansible_shell_type' from source: unknown 22690 1727204238.25153: variable 'ansible_shell_executable' from source: unknown 22690 1727204238.25155: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.25160: variable 'ansible_pipelining' from source: unknown 22690 1727204238.25163: variable 'ansible_timeout' from source: unknown 22690 1727204238.25165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.25289: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204238.25296: variable 'omit' from source: magic vars 22690 1727204238.25302: starting attempt loop 22690 1727204238.25305: running the handler 22690 1727204238.25345: variable 'network_provider' from source: set_fact 22690 1727204238.25411: variable 'network_provider' from source: set_fact 22690 1727204238.25432: handler run complete 22690 1727204238.25446: attempt loop complete, returning result 22690 1727204238.25449: _execute() done 22690 1727204238.25452: dumping result to json 22690 1727204238.25456: done dumping result, returning 22690 1727204238.25464: done running TaskExecutor() for managed-node2/TASK: Show network_provider [127b8e07-fff9-78bb-bf56-00000000000c] 22690 1727204238.25468: sending task result for task 127b8e07-fff9-78bb-bf56-00000000000c 22690 1727204238.25561: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000000c 22690 1727204238.25567: WORKER PROCESS EXITING ok: [managed-node2] => { "network_provider": "nm" } 22690 1727204238.25626: no more pending results, returning what we have 22690 1727204238.25629: results queue empty 22690 1727204238.25630: checking for any_errors_fatal 22690 1727204238.25638: done checking for any_errors_fatal 22690 1727204238.25639: checking for max_fail_percentage 22690 1727204238.25641: done checking for max_fail_percentage 22690 1727204238.25641: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.25642: done checking to see if all hosts have failed 22690 1727204238.25643: getting the remaining hosts for this loop 22690 1727204238.25644: done getting the remaining hosts for this loop 22690 1727204238.25649: getting the next task for host managed-node2 22690 1727204238.25657: done getting next task for host managed-node2 22690 1727204238.25658: ^ task is: TASK: meta (flush_handlers) 22690 1727204238.25660: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204238.25665: getting variables 22690 1727204238.25668: in VariableManager get_vars() 22690 1727204238.25705: Calling all_inventory to load vars for managed-node2 22690 1727204238.25708: Calling groups_inventory to load vars for managed-node2 22690 1727204238.25711: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.25725: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.25727: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.25730: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.25876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.26002: done with get_vars() 22690 1727204238.26014: done getting variables 22690 1727204238.26070: in VariableManager get_vars() 22690 1727204238.26078: Calling all_inventory to load vars for managed-node2 22690 1727204238.26080: Calling groups_inventory to load vars for managed-node2 22690 1727204238.26081: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.26086: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.26088: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.26090: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.26272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.26486: done with get_vars() 22690 1727204238.26501: done queuing things up, now waiting for results queue to drain 22690 1727204238.26503: results queue empty 22690 1727204238.26504: checking for any_errors_fatal 22690 1727204238.26507: done checking for any_errors_fatal 22690 1727204238.26508: checking for max_fail_percentage 22690 1727204238.26509: done checking for max_fail_percentage 22690 1727204238.26510: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.26511: done checking to see if all hosts have failed 22690 1727204238.26511: getting the remaining hosts for this loop 22690 1727204238.26512: done getting the remaining hosts for this loop 22690 1727204238.26518: getting the next task for host managed-node2 22690 1727204238.26529: done getting next task for host managed-node2 22690 1727204238.26531: ^ task is: TASK: meta (flush_handlers) 22690 1727204238.26532: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204238.26535: getting variables 22690 1727204238.26536: in VariableManager get_vars() 22690 1727204238.26545: Calling all_inventory to load vars for managed-node2 22690 1727204238.26548: Calling groups_inventory to load vars for managed-node2 22690 1727204238.26550: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.26555: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.26557: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.26560: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.26735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.26959: done with get_vars() 22690 1727204238.26971: done getting variables 22690 1727204238.27045: in VariableManager get_vars() 22690 1727204238.27055: Calling all_inventory to load vars for managed-node2 22690 1727204238.27058: Calling groups_inventory to load vars for managed-node2 22690 1727204238.27061: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.27068: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.27071: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.27075: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.27277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.27497: done with get_vars() 22690 1727204238.27512: done queuing things up, now waiting for results queue to drain 22690 1727204238.27514: results queue empty 22690 1727204238.27518: checking for any_errors_fatal 22690 1727204238.27519: done checking for any_errors_fatal 22690 1727204238.27520: checking for max_fail_percentage 22690 1727204238.27521: done checking for max_fail_percentage 22690 1727204238.27522: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.27523: done checking to see if all hosts have failed 22690 1727204238.27524: getting the remaining hosts for this loop 22690 1727204238.27525: done getting the remaining hosts for this loop 22690 1727204238.27528: getting the next task for host managed-node2 22690 1727204238.27532: done getting next task for host managed-node2 22690 1727204238.27533: ^ task is: None 22690 1727204238.27535: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204238.27536: done queuing things up, now waiting for results queue to drain 22690 1727204238.27537: results queue empty 22690 1727204238.27537: checking for any_errors_fatal 22690 1727204238.27538: done checking for any_errors_fatal 22690 1727204238.27539: checking for max_fail_percentage 22690 1727204238.27540: done checking for max_fail_percentage 22690 1727204238.27541: checking to see if all hosts have failed and the running result is not ok 22690 1727204238.27543: done checking to see if all hosts have failed 22690 1727204238.27545: getting the next task for host managed-node2 22690 1727204238.27549: done getting next task for host managed-node2 22690 1727204238.27549: ^ task is: None 22690 1727204238.27551: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204238.27611: in VariableManager get_vars() 22690 1727204238.27632: done with get_vars() 22690 1727204238.27638: in VariableManager get_vars() 22690 1727204238.27648: done with get_vars() 22690 1727204238.27652: variable 'omit' from source: magic vars 22690 1727204238.27706: in VariableManager get_vars() 22690 1727204238.27720: done with get_vars() 22690 1727204238.27743: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 22690 1727204238.27980: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204238.28184: getting the remaining hosts for this loop 22690 1727204238.28186: done getting the remaining hosts for this loop 22690 1727204238.28189: getting the next task for host managed-node2 22690 1727204238.28192: done getting next task for host managed-node2 22690 1727204238.28195: ^ task is: TASK: Gathering Facts 22690 1727204238.28197: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204238.28199: getting variables 22690 1727204238.28200: in VariableManager get_vars() 22690 1727204238.28212: Calling all_inventory to load vars for managed-node2 22690 1727204238.28215: Calling groups_inventory to load vars for managed-node2 22690 1727204238.28218: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204238.28224: Calling all_plugins_play to load vars for managed-node2 22690 1727204238.28227: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204238.28231: Calling groups_plugins_play to load vars for managed-node2 22690 1727204238.28395: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204238.28596: done with get_vars() 22690 1727204238.28607: done getting variables 22690 1727204238.28661: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Tuesday 24 September 2024 14:57:18 -0400 (0:00:00.053) 0:00:05.570 ***** 22690 1727204238.28692: entering _queue_task() for managed-node2/gather_facts 22690 1727204238.29045: worker is 1 (out of 1 available) 22690 1727204238.29059: exiting _queue_task() for managed-node2/gather_facts 22690 1727204238.29076: done queuing things up, now waiting for results queue to drain 22690 1727204238.29077: waiting for pending results... 
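The new play "Test configuring ethernet devices" begins with its own Gathering Facts task, and the records that follow show the start of the same connection handshake as before: an 'echo ~' to resolve the remote home directory, then a mkdir under umask 77 that echoes the resolved temp-dir path back so the controller knows where to upload the next module. A rough Python sketch of that handshake (host and naming scheme are placeholders, not Ansible's exact code):

import random
import subprocess
import time

HOST = "root@10.31.47.73"   # target host seen in the log

def remote_mkdtemp() -> str:
    # Resolve the remote home directory (the "echo ~ && sleep 0" step below).
    home = subprocess.run(["ssh", HOST, "echo ~ && sleep 0"],
                          capture_output=True, text=True, check=True).stdout.strip()
    # Placeholder naming scheme; Ansible combines a timestamp, pid and random suffix.
    name = f"ansible-tmp-{time.time()}-{random.randint(0, 2**48)}"
    # umask 77 makes the new directories mode 0700, readable only by the remote user.
    cmd = (f'( umask 77 && mkdir -p "{home}/.ansible/tmp" '
           f'&& mkdir "{home}/.ansible/tmp/{name}" '
           f'&& echo {name}="{home}/.ansible/tmp/{name}" ) && sleep 0')
    out = subprocess.run(["ssh", HOST, cmd],
                         capture_output=True, text=True, check=True).stdout
    # The echoed line looks like ansible-tmp-...=<resolved path>; keep the path part.
    return out.strip().split("=", 1)[1]

# remote_tmp = remote_mkdtemp()   # e.g. /root/.ansible/tmp/ansible-tmp-...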
22690 1727204238.29386: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204238.29575: in run() - task 127b8e07-fff9-78bb-bf56-0000000000f0 22690 1727204238.29770: variable 'ansible_search_path' from source: unknown 22690 1727204238.29775: calling self._execute() 22690 1727204238.29778: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.29781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.29993: variable 'omit' from source: magic vars 22690 1727204238.30563: variable 'ansible_distribution_major_version' from source: facts 22690 1727204238.30588: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204238.30605: variable 'omit' from source: magic vars 22690 1727204238.30641: variable 'omit' from source: magic vars 22690 1727204238.30691: variable 'omit' from source: magic vars 22690 1727204238.30751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204238.30802: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204238.30837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204238.30927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204238.30930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204238.30933: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204238.30936: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.30938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.31056: Set connection var ansible_connection to ssh 22690 1727204238.31079: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204238.31092: Set connection var ansible_pipelining to False 22690 1727204238.31099: Set connection var ansible_shell_type to sh 22690 1727204238.31112: Set connection var ansible_shell_executable to /bin/sh 22690 1727204238.31127: Set connection var ansible_timeout to 10 22690 1727204238.31161: variable 'ansible_shell_executable' from source: unknown 22690 1727204238.31171: variable 'ansible_connection' from source: unknown 22690 1727204238.31178: variable 'ansible_module_compression' from source: unknown 22690 1727204238.31253: variable 'ansible_shell_type' from source: unknown 22690 1727204238.31257: variable 'ansible_shell_executable' from source: unknown 22690 1727204238.31260: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204238.31262: variable 'ansible_pipelining' from source: unknown 22690 1727204238.31264: variable 'ansible_timeout' from source: unknown 22690 1727204238.31269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204238.31433: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204238.31453: variable 'omit' from source: magic vars 22690 1727204238.31472: starting attempt loop 22690 1727204238.31485: running the 
handler 22690 1727204238.31508: variable 'ansible_facts' from source: unknown 22690 1727204238.31535: _low_level_execute_command(): starting 22690 1727204238.31550: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204238.32419: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.32441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.32502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204238.32505: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204238.32597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204238.34987: stdout chunk (state=3): >>>/root <<< 22690 1727204238.35128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204238.35201: stderr chunk (state=3): >>><<< 22690 1727204238.35205: stdout chunk (state=3): >>><<< 22690 1727204238.35227: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204238.35241: _low_level_execute_command(): starting 22690 1727204238.35247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819 `" && echo ansible-tmp-1727204238.352275-22987-202482993906819="` echo 
/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819 `" ) && sleep 0' 22690 1727204238.35753: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.35756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204238.35759: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.35761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204238.35779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.35824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204238.35827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204238.35832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204238.35907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204238.38783: stdout chunk (state=3): >>>ansible-tmp-1727204238.352275-22987-202482993906819=/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819 <<< 22690 1727204238.38936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204238.39004: stderr chunk (state=3): >>><<< 22690 1727204238.39008: stdout chunk (state=3): >>><<< 22690 1727204238.39027: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204238.352275-22987-202482993906819=/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204238.39056: variable 'ansible_module_compression' from source: unknown 22690 
1727204238.39108: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204238.39162: variable 'ansible_facts' from source: unknown 22690 1727204238.39308: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py 22690 1727204238.39547: Sending initial data 22690 1727204238.39551: Sent initial data (153 bytes) 22690 1727204238.39983: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.39989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.39992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.39995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.40052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204238.40056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204238.40058: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204238.40143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204238.42538: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204238.42616: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204238.42697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwtn1ajm9 /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py <<< 22690 1727204238.42701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py" <<< 22690 1727204238.42776: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwtn1ajm9" to remote "/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py" <<< 22690 1727204238.42778: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py" <<< 22690 1727204238.44059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204238.44141: stderr chunk (state=3): >>><<< 22690 1727204238.44145: stdout chunk (state=3): >>><<< 22690 1727204238.44167: done transferring module to remote 22690 1727204238.44179: _low_level_execute_command(): starting 22690 1727204238.44184: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/ /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py && sleep 0' 22690 1727204238.44690: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.44694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.44697: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.44699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.44759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204238.44767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204238.44770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204238.44835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204238.47414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204238.47479: stderr chunk (state=3): >>><<< 22690 1727204238.47483: stdout chunk (state=3): >>><<< 22690 1727204238.47497: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22690 1727204238.47500: _low_level_execute_command(): starting 22690 1727204238.47506: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/AnsiballZ_setup.py && sleep 0' 22690 1727204238.48018: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204238.48023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204238.48026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204238.48093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204238.48099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204238.48102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204238.48185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22690 1727204239.30956: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", 
"ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3033, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 683, "free": 3033}, "nocache": {"free": 3464, "used": 252}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 585, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316776960, "block_size": 4096, "block_total": 64479564, "block_available": 61356635, "block_used": 3122929, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.60888671875, "5m": 0.5234375, "15m": 0.28125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", 
"LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "19", "epoch": "1727204239", "epoch_int": "1727204239", "date": "2024-09-24", "time": "14:57:19", "iso8601_micro": "2024-09-24T18:57:19.303919Z", "iso8601": "2024-09-24T18:57:19Z", "iso8601_basic": "20240924T145719303919", "iso8601_basic_short": "20240924T145719", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204239.34288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204239.34292: stdout chunk (state=3): >>><<< 22690 1727204239.34295: stderr chunk (state=3): >>><<< 22690 1727204239.34298: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3033, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 683, "free": 3033}, "nocache": {"free": 3464, "used": 252}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 585, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316776960, "block_size": 4096, "block_total": 64479564, "block_available": 61356635, "block_used": 3122929, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.60888671875, "5m": 0.5234375, "15m": 0.28125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "19", "epoch": "1727204239", "epoch_int": "1727204239", "date": "2024-09-24", "time": "14:57:19", "iso8601_micro": "2024-09-24T18:57:19.303919Z", "iso8601": "2024-09-24T18:57:19Z", "iso8601_basic": "20240924T145719303919", "iso8601_basic_short": "20240924T145719", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204239.34811: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204239.34849: _low_level_execute_command(): starting 22690 1727204239.34882: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204238.352275-22987-202482993906819/ > /dev/null 2>&1 && sleep 0' 22690 1727204239.36568: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204239.36595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204239.36669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204239.36900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204239.37098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204239.39065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204239.39081: stdout chunk (state=3): >>><<< 22690 1727204239.39178: stderr chunk (state=3): >>><<< 22690 1727204239.39204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204239.39222: handler run complete 22690 1727204239.39403: variable 'ansible_facts' from source: unknown 22690 1727204239.39566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.40161: variable 'ansible_facts' from source: unknown 22690 1727204239.40288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.40468: attempt loop complete, returning result 22690 1727204239.40479: _execute() done 22690 1727204239.40488: dumping result to json 22690 1727204239.40521: done dumping result, returning 22690 1727204239.40569: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-0000000000f0] 22690 1727204239.40573: sending task result for task 127b8e07-fff9-78bb-bf56-0000000000f0 ok: [managed-node2] 22690 1727204239.41303: no more pending results, returning what we have 22690 1727204239.41306: results queue empty 22690 1727204239.41307: checking for any_errors_fatal 22690 1727204239.41308: done checking for any_errors_fatal 22690 1727204239.41309: checking for max_fail_percentage 22690 1727204239.41310: done checking for max_fail_percentage 22690 1727204239.41310: checking to see if all hosts have failed and the running result is not ok 22690 1727204239.41311: done checking to see if all hosts have failed 22690 1727204239.41311: getting the remaining hosts for this loop 22690 1727204239.41312: done getting the remaining hosts for this loop 22690 1727204239.41317: getting the next task for host managed-node2 22690 1727204239.41320: done getting next task for host managed-node2 22690 1727204239.41321: ^ task is: TASK: meta (flush_handlers) 22690 1727204239.41323: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204239.41325: getting variables 22690 1727204239.41326: in VariableManager get_vars() 22690 1727204239.41347: Calling all_inventory to load vars for managed-node2 22690 1727204239.41349: Calling groups_inventory to load vars for managed-node2 22690 1727204239.41351: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.41361: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.41363: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.41367: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.41470: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000000f0 22690 1727204239.41474: WORKER PROCESS EXITING 22690 1727204239.41485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.41611: done with get_vars() 22690 1727204239.41620: done getting variables 22690 1727204239.41677: in VariableManager get_vars() 22690 1727204239.41685: Calling all_inventory to load vars for managed-node2 22690 1727204239.41687: Calling groups_inventory to load vars for managed-node2 22690 1727204239.41688: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.41692: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.41693: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.41695: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.41816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.41969: done with get_vars() 22690 1727204239.41978: done queuing things up, now waiting for results queue to drain 22690 1727204239.41979: results queue empty 22690 1727204239.41980: checking for any_errors_fatal 22690 1727204239.41982: done checking for any_errors_fatal 22690 1727204239.41983: checking for max_fail_percentage 22690 1727204239.41983: done checking for max_fail_percentage 22690 1727204239.41984: checking to see if all hosts have failed and the running result is not ok 22690 1727204239.41984: done checking to see if all hosts have failed 22690 1727204239.41990: getting the remaining hosts for this loop 22690 1727204239.41991: done getting the remaining hosts for this loop 22690 1727204239.41996: getting the next task for host managed-node2 22690 1727204239.42001: done getting next task for host managed-node2 22690 1727204239.42004: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 22690 1727204239.42005: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204239.42007: getting variables 22690 1727204239.42008: in VariableManager get_vars() 22690 1727204239.42024: Calling all_inventory to load vars for managed-node2 22690 1727204239.42026: Calling groups_inventory to load vars for managed-node2 22690 1727204239.42028: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.42032: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.42033: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.42035: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.42141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.42273: done with get_vars() 22690 1727204239.42282: done getting variables 22690 1727204239.42323: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204239.42463: variable 'type' from source: play vars 22690 1727204239.42471: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Tuesday 24 September 2024 14:57:19 -0400 (0:00:01.138) 0:00:06.708 ***** 22690 1727204239.42508: entering _queue_task() for managed-node2/set_fact 22690 1727204239.42875: worker is 1 (out of 1 available) 22690 1727204239.42889: exiting _queue_task() for managed-node2/set_fact 22690 1727204239.42904: done queuing things up, now waiting for results queue to drain 22690 1727204239.42905: waiting for pending results... 
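The task queued at this point is a set_fact call whose templated name, listed earlier in its raw form as 'Set type={{ type }} and interface={{ interface }}', renders to 'Set type=veth and interface=lsr27' from the play vars. A hedged reconstruction of the relevant snippet around tests_ethernet.yml:20, inferred from that raw name and from the ansible_facts result reported further down (the real playbook may differ in detail):

  vars:
    type: veth                    # play vars implied by the rendered task name
    interface: lsr27

  tasks:
    - name: "Set type={{ type }} and interface={{ interface }}"
      ansible.builtin.set_fact:
        type: "{{ type }}"
        interface: "{{ interface }}"

The recorded result is consistent with this sketch: ok: [managed-node2] with ansible_facts {"interface": "lsr27", "type": "veth"} and changed: false.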
22690 1727204239.43285: running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=lsr27 22690 1727204239.43291: in run() - task 127b8e07-fff9-78bb-bf56-00000000000f 22690 1727204239.43294: variable 'ansible_search_path' from source: unknown 22690 1727204239.43309: calling self._execute() 22690 1727204239.43448: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.43463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.43483: variable 'omit' from source: magic vars 22690 1727204239.43813: variable 'ansible_distribution_major_version' from source: facts 22690 1727204239.43828: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204239.43832: variable 'omit' from source: magic vars 22690 1727204239.43859: variable 'omit' from source: magic vars 22690 1727204239.43884: variable 'type' from source: play vars 22690 1727204239.43946: variable 'type' from source: play vars 22690 1727204239.43962: variable 'interface' from source: play vars 22690 1727204239.44005: variable 'interface' from source: play vars 22690 1727204239.44017: variable 'omit' from source: magic vars 22690 1727204239.44055: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204239.44092: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204239.44108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204239.44125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204239.44137: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204239.44161: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204239.44165: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.44171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.44249: Set connection var ansible_connection to ssh 22690 1727204239.44258: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204239.44267: Set connection var ansible_pipelining to False 22690 1727204239.44270: Set connection var ansible_shell_type to sh 22690 1727204239.44277: Set connection var ansible_shell_executable to /bin/sh 22690 1727204239.44286: Set connection var ansible_timeout to 10 22690 1727204239.44306: variable 'ansible_shell_executable' from source: unknown 22690 1727204239.44309: variable 'ansible_connection' from source: unknown 22690 1727204239.44311: variable 'ansible_module_compression' from source: unknown 22690 1727204239.44314: variable 'ansible_shell_type' from source: unknown 22690 1727204239.44316: variable 'ansible_shell_executable' from source: unknown 22690 1727204239.44322: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.44325: variable 'ansible_pipelining' from source: unknown 22690 1727204239.44328: variable 'ansible_timeout' from source: unknown 22690 1727204239.44333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.44516: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204239.44527: variable 'omit' from source: magic vars 22690 1727204239.44532: starting attempt loop 22690 1727204239.44535: running the handler 22690 1727204239.44550: handler run complete 22690 1727204239.44558: attempt loop complete, returning result 22690 1727204239.44562: _execute() done 22690 1727204239.44564: dumping result to json 22690 1727204239.44568: done dumping result, returning 22690 1727204239.44576: done running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=lsr27 [127b8e07-fff9-78bb-bf56-00000000000f] 22690 1727204239.44581: sending task result for task 127b8e07-fff9-78bb-bf56-00000000000f 22690 1727204239.44671: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000000f 22690 1727204239.44674: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 22690 1727204239.44735: no more pending results, returning what we have 22690 1727204239.44739: results queue empty 22690 1727204239.44740: checking for any_errors_fatal 22690 1727204239.44742: done checking for any_errors_fatal 22690 1727204239.44743: checking for max_fail_percentage 22690 1727204239.44744: done checking for max_fail_percentage 22690 1727204239.44745: checking to see if all hosts have failed and the running result is not ok 22690 1727204239.44746: done checking to see if all hosts have failed 22690 1727204239.44747: getting the remaining hosts for this loop 22690 1727204239.44748: done getting the remaining hosts for this loop 22690 1727204239.44752: getting the next task for host managed-node2 22690 1727204239.44758: done getting next task for host managed-node2 22690 1727204239.44761: ^ task is: TASK: Include the task 'show_interfaces.yml' 22690 1727204239.44763: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204239.44773: getting variables 22690 1727204239.44775: in VariableManager get_vars() 22690 1727204239.44813: Calling all_inventory to load vars for managed-node2 22690 1727204239.44816: Calling groups_inventory to load vars for managed-node2 22690 1727204239.44819: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.44831: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.44833: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.44836: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.45030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.45151: done with get_vars() 22690 1727204239.45160: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.027) 0:00:06.735 ***** 22690 1727204239.45236: entering _queue_task() for managed-node2/include_tasks 22690 1727204239.45564: worker is 1 (out of 1 available) 22690 1727204239.45576: exiting _queue_task() for managed-node2/include_tasks 22690 1727204239.45590: done queuing things up, now waiting for results queue to drain 22690 1727204239.45591: waiting for pending results... 22690 1727204239.46092: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 22690 1727204239.46098: in run() - task 127b8e07-fff9-78bb-bf56-000000000010 22690 1727204239.46100: variable 'ansible_search_path' from source: unknown 22690 1727204239.46113: calling self._execute() 22690 1727204239.46206: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.46293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.46297: variable 'omit' from source: magic vars 22690 1727204239.46638: variable 'ansible_distribution_major_version' from source: facts 22690 1727204239.46657: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204239.46670: _execute() done 22690 1727204239.46679: dumping result to json 22690 1727204239.46687: done dumping result, returning 22690 1727204239.46698: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-78bb-bf56-000000000010] 22690 1727204239.46708: sending task result for task 127b8e07-fff9-78bb-bf56-000000000010 22690 1727204239.46862: no more pending results, returning what we have 22690 1727204239.46871: in VariableManager get_vars() 22690 1727204239.46916: Calling all_inventory to load vars for managed-node2 22690 1727204239.46919: Calling groups_inventory to load vars for managed-node2 22690 1727204239.46924: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.46942: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.46946: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.46949: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.47460: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000010 22690 1727204239.47467: WORKER PROCESS EXITING 22690 1727204239.47498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.47722: done with get_vars() 22690 1727204239.47732: 
variable 'ansible_search_path' from source: unknown 22690 1727204239.47749: we have included files to process 22690 1727204239.47750: generating all_blocks data 22690 1727204239.47751: done generating all_blocks data 22690 1727204239.47752: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22690 1727204239.47753: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22690 1727204239.47756: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22690 1727204239.47923: in VariableManager get_vars() 22690 1727204239.47942: done with get_vars() 22690 1727204239.48060: done processing included file 22690 1727204239.48062: iterating over new_blocks loaded from include file 22690 1727204239.48064: in VariableManager get_vars() 22690 1727204239.48080: done with get_vars() 22690 1727204239.48081: filtering new block on tags 22690 1727204239.48099: done filtering new block on tags 22690 1727204239.48101: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 22690 1727204239.48141: extending task lists for all hosts with included blocks 22690 1727204239.48226: done extending task lists 22690 1727204239.48227: done processing included files 22690 1727204239.48228: results queue empty 22690 1727204239.48229: checking for any_errors_fatal 22690 1727204239.48232: done checking for any_errors_fatal 22690 1727204239.48233: checking for max_fail_percentage 22690 1727204239.48234: done checking for max_fail_percentage 22690 1727204239.48235: checking to see if all hosts have failed and the running result is not ok 22690 1727204239.48236: done checking to see if all hosts have failed 22690 1727204239.48236: getting the remaining hosts for this loop 22690 1727204239.48238: done getting the remaining hosts for this loop 22690 1727204239.48240: getting the next task for host managed-node2 22690 1727204239.48244: done getting next task for host managed-node2 22690 1727204239.48246: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22690 1727204239.48249: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204239.48251: getting variables 22690 1727204239.48252: in VariableManager get_vars() 22690 1727204239.48261: Calling all_inventory to load vars for managed-node2 22690 1727204239.48263: Calling groups_inventory to load vars for managed-node2 22690 1727204239.48267: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.48273: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.48275: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.48278: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.48409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.48574: done with get_vars() 22690 1727204239.48584: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.034) 0:00:06.769 ***** 22690 1727204239.48658: entering _queue_task() for managed-node2/include_tasks 22690 1727204239.48997: worker is 1 (out of 1 available) 22690 1727204239.49012: exiting _queue_task() for managed-node2/include_tasks 22690 1727204239.49026: done queuing things up, now waiting for results queue to drain 22690 1727204239.49028: waiting for pending results... 22690 1727204239.49313: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 22690 1727204239.49432: in run() - task 127b8e07-fff9-78bb-bf56-000000000104 22690 1727204239.49459: variable 'ansible_search_path' from source: unknown 22690 1727204239.49471: variable 'ansible_search_path' from source: unknown 22690 1727204239.49522: calling self._execute() 22690 1727204239.49618: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.49631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.49648: variable 'omit' from source: magic vars 22690 1727204239.50070: variable 'ansible_distribution_major_version' from source: facts 22690 1727204239.50089: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204239.50101: _execute() done 22690 1727204239.50109: dumping result to json 22690 1727204239.50117: done dumping result, returning 22690 1727204239.50127: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-78bb-bf56-000000000104] 22690 1727204239.50137: sending task result for task 127b8e07-fff9-78bb-bf56-000000000104 22690 1727204239.50399: no more pending results, returning what we have 22690 1727204239.50406: in VariableManager get_vars() 22690 1727204239.50446: Calling all_inventory to load vars for managed-node2 22690 1727204239.50450: Calling groups_inventory to load vars for managed-node2 22690 1727204239.50454: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.50474: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.50478: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.50481: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.50905: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000104 22690 1727204239.50910: WORKER PROCESS EXITING 22690 1727204239.50939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 22690 1727204239.51119: done with get_vars() 22690 1727204239.51128: variable 'ansible_search_path' from source: unknown 22690 1727204239.51129: variable 'ansible_search_path' from source: unknown 22690 1727204239.51173: we have included files to process 22690 1727204239.51174: generating all_blocks data 22690 1727204239.51176: done generating all_blocks data 22690 1727204239.51177: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22690 1727204239.51178: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22690 1727204239.51181: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22690 1727204239.51508: done processing included file 22690 1727204239.51510: iterating over new_blocks loaded from include file 22690 1727204239.51512: in VariableManager get_vars() 22690 1727204239.51527: done with get_vars() 22690 1727204239.51528: filtering new block on tags 22690 1727204239.51546: done filtering new block on tags 22690 1727204239.51548: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 22690 1727204239.51553: extending task lists for all hosts with included blocks 22690 1727204239.51654: done extending task lists 22690 1727204239.51656: done processing included files 22690 1727204239.51656: results queue empty 22690 1727204239.51657: checking for any_errors_fatal 22690 1727204239.51660: done checking for any_errors_fatal 22690 1727204239.51661: checking for max_fail_percentage 22690 1727204239.51662: done checking for max_fail_percentage 22690 1727204239.51663: checking to see if all hosts have failed and the running result is not ok 22690 1727204239.51664: done checking to see if all hosts have failed 22690 1727204239.51664: getting the remaining hosts for this loop 22690 1727204239.51667: done getting the remaining hosts for this loop 22690 1727204239.51670: getting the next task for host managed-node2 22690 1727204239.51674: done getting next task for host managed-node2 22690 1727204239.51677: ^ task is: TASK: Gather current interface info 22690 1727204239.51680: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204239.51682: getting variables 22690 1727204239.51683: in VariableManager get_vars() 22690 1727204239.51693: Calling all_inventory to load vars for managed-node2 22690 1727204239.51696: Calling groups_inventory to load vars for managed-node2 22690 1727204239.51698: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204239.51704: Calling all_plugins_play to load vars for managed-node2 22690 1727204239.51706: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204239.51709: Calling groups_plugins_play to load vars for managed-node2 22690 1727204239.51849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204239.52019: done with get_vars() 22690 1727204239.52030: done getting variables 22690 1727204239.52077: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.034) 0:00:06.804 ***** 22690 1727204239.52107: entering _queue_task() for managed-node2/command 22690 1727204239.52444: worker is 1 (out of 1 available) 22690 1727204239.52458: exiting _queue_task() for managed-node2/command 22690 1727204239.52473: done queuing things up, now waiting for results queue to drain 22690 1727204239.52475: waiting for pending results... 
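
For orientation before the low-level executor output that follows: the "Gather current interface info" task queued above is an ansible.legacy.command invocation. A task of roughly this shape would produce the module_args dumped further down in the log (chdir: /sys/class/net, ls -1); the register name _current_interfaces is inferred from the variable lookup in the subsequent set_fact, and changed_when is inferred from the final ok/changed=false result, so treat this as a sketch rather than the verbatim contents of get_current_interfaces.yml:

# Sketch of the task driving the run below (reconstructed from the logged
# module_args; not the actual file contents).
- name: Gather current interface info
  command:
    chdir: /sys/class/net
    cmd: ls -1
  register: _current_interfaces   # inferred from the later "_current_interfaces" lookup
  changed_when: false             # inferred: the module reports changed=true, the task reports ok
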
22690 1727204239.52749: running TaskExecutor() for managed-node2/TASK: Gather current interface info 22690 1727204239.52881: in run() - task 127b8e07-fff9-78bb-bf56-000000000115 22690 1727204239.52903: variable 'ansible_search_path' from source: unknown 22690 1727204239.52911: variable 'ansible_search_path' from source: unknown 22690 1727204239.52957: calling self._execute() 22690 1727204239.53049: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.53062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.53084: variable 'omit' from source: magic vars 22690 1727204239.53918: variable 'ansible_distribution_major_version' from source: facts 22690 1727204239.53939: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204239.53957: variable 'omit' from source: magic vars 22690 1727204239.54018: variable 'omit' from source: magic vars 22690 1727204239.54271: variable 'omit' from source: magic vars 22690 1727204239.54275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204239.54278: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204239.54281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204239.54283: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204239.54285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204239.54287: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204239.54289: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.54291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.54369: Set connection var ansible_connection to ssh 22690 1727204239.54386: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204239.54401: Set connection var ansible_pipelining to False 22690 1727204239.54410: Set connection var ansible_shell_type to sh 22690 1727204239.54421: Set connection var ansible_shell_executable to /bin/sh 22690 1727204239.54432: Set connection var ansible_timeout to 10 22690 1727204239.54459: variable 'ansible_shell_executable' from source: unknown 22690 1727204239.54467: variable 'ansible_connection' from source: unknown 22690 1727204239.54474: variable 'ansible_module_compression' from source: unknown 22690 1727204239.54480: variable 'ansible_shell_type' from source: unknown 22690 1727204239.54516: variable 'ansible_shell_executable' from source: unknown 22690 1727204239.54519: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204239.54521: variable 'ansible_pipelining' from source: unknown 22690 1727204239.54523: variable 'ansible_timeout' from source: unknown 22690 1727204239.54525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204239.54667: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204239.54685: variable 'omit' from source: magic vars 22690 
1727204239.54733: starting attempt loop 22690 1727204239.54737: running the handler 22690 1727204239.54739: _low_level_execute_command(): starting 22690 1727204239.54741: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204239.55509: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204239.55586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204239.55642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204239.55658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204239.55687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204239.55796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204239.57490: stdout chunk (state=3): >>>/root <<< 22690 1727204239.57674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204239.57678: stdout chunk (state=3): >>><<< 22690 1727204239.57680: stderr chunk (state=3): >>><<< 22690 1727204239.57705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204239.57811: _low_level_execute_command(): starting 22690 1727204239.57815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847 `" && echo 
ansible-tmp-1727204239.5771194-23042-254448655195847="` echo /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847 `" ) && sleep 0' 22690 1727204239.58413: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204239.58431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204239.58448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204239.58490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204239.58528: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204239.58585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204239.58643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204239.58663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204239.58695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204239.58807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204239.60854: stdout chunk (state=3): >>>ansible-tmp-1727204239.5771194-23042-254448655195847=/root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847 <<< 22690 1727204239.61006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204239.61036: stdout chunk (state=3): >>><<< 22690 1727204239.61041: stderr chunk (state=3): >>><<< 22690 1727204239.61077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204239.5771194-23042-254448655195847=/root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 22690 1727204239.61274: variable 'ansible_module_compression' from source: unknown 22690 1727204239.61278: ANSIBALLZ: Using generic lock for ansible.legacy.command 22690 1727204239.61280: ANSIBALLZ: Acquiring lock 22690 1727204239.61283: ANSIBALLZ: Lock acquired: 139846653776800 22690 1727204239.61286: ANSIBALLZ: Creating module 22690 1727204239.78391: ANSIBALLZ: Writing module into payload 22690 1727204239.78504: ANSIBALLZ: Writing module 22690 1727204239.78540: ANSIBALLZ: Renaming module 22690 1727204239.78554: ANSIBALLZ: Done creating module 22690 1727204239.78584: variable 'ansible_facts' from source: unknown 22690 1727204239.78671: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py 22690 1727204239.78830: Sending initial data 22690 1727204239.78930: Sent initial data (156 bytes) 22690 1727204239.79568: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204239.79594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204239.79708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204239.79739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204239.79982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204239.81531: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204239.81627: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204239.81729: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp50icfmum /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py <<< 22690 1727204239.81739: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py" <<< 22690 1727204239.81790: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp50icfmum" to remote "/root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py" <<< 22690 1727204239.82792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204239.82809: stdout chunk (state=3): >>><<< 22690 1727204239.82824: stderr chunk (state=3): >>><<< 22690 1727204239.82852: done transferring module to remote 22690 1727204239.82871: _low_level_execute_command(): starting 22690 1727204239.82880: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/ /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py && sleep 0' 22690 1727204239.83577: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204239.83697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204239.83739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204239.83845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204239.85808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204239.85812: stdout chunk (state=3): >>><<< 22690 1727204239.85815: stderr chunk (state=3): >>><<< 22690 1727204239.85836: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204239.85932: _low_level_execute_command(): starting 22690 1727204239.85936: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/AnsiballZ_command.py && sleep 0' 22690 1727204239.86575: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204239.86592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204239.86613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204239.86634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204239.86735: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204239.86754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204239.86775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204239.86800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204239.86912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.03906: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:20.034029", "end": "2024-09-24 14:57:20.037620", "delta": "0:00:00.003591", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204240.05608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204240.05680: stderr chunk (state=3): >>><<< 22690 1727204240.05684: stdout chunk (state=3): >>><<< 22690 1727204240.05701: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:20.034029", "end": "2024-09-24 14:57:20.037620", "delta": "0:00:00.003591", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
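
The JSON document in the stdout chunk above is the command module's raw return; after Ansible post-processes it, the registered variable that the following tasks consume looks approximately like this (abridged to the fields used here; stdout_lines is the line-split form Ansible derives from stdout and is not shown verbatim in the log):

# Approximate shape of the registered result (abridged from the logged JSON).
_current_interfaces:
  changed: true        # raw module result; the task itself reports ok below
  rc: 0
  cmd: ["ls", "-1"]
  stdout: "bonding_masters\neth0\nlo"
  stdout_lines:
    - bonding_masters
    - eth0
    - lo
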
22690 1727204240.05740: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204240.05747: _low_level_execute_command(): starting 22690 1727204240.05753: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204239.5771194-23042-254448655195847/ > /dev/null 2>&1 && sleep 0' 22690 1727204240.06270: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.06274: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.06278: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.06281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.06337: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204240.06341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.06417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.08375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.08450: stderr chunk (state=3): >>><<< 22690 1727204240.08454: stdout chunk (state=3): >>><<< 22690 1727204240.08672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.08676: handler run complete 22690 1727204240.08678: Evaluated conditional (False): False 22690 1727204240.08681: attempt loop complete, returning result 22690 1727204240.08683: _execute() done 22690 1727204240.08684: dumping result to json 22690 1727204240.08686: done dumping result, returning 22690 1727204240.08688: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-78bb-bf56-000000000115] 22690 1727204240.08690: sending task result for task 127b8e07-fff9-78bb-bf56-000000000115 22690 1727204240.08762: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000115 22690 1727204240.08764: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003591", "end": "2024-09-24 14:57:20.037620", "rc": 0, "start": "2024-09-24 14:57:20.034029" } STDOUT: bonding_masters eth0 lo 22690 1727204240.09397: no more pending results, returning what we have 22690 1727204240.09401: results queue empty 22690 1727204240.09401: checking for any_errors_fatal 22690 1727204240.09403: done checking for any_errors_fatal 22690 1727204240.09404: checking for max_fail_percentage 22690 1727204240.09405: done checking for max_fail_percentage 22690 1727204240.09406: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.09407: done checking to see if all hosts have failed 22690 1727204240.09408: getting the remaining hosts for this loop 22690 1727204240.09409: done getting the remaining hosts for this loop 22690 1727204240.09413: getting the next task for host managed-node2 22690 1727204240.09420: done getting next task for host managed-node2 22690 1727204240.09422: ^ task is: TASK: Set current_interfaces 22690 1727204240.09426: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.09428: getting variables 22690 1727204240.09429: in VariableManager get_vars() 22690 1727204240.09454: Calling all_inventory to load vars for managed-node2 22690 1727204240.09457: Calling groups_inventory to load vars for managed-node2 22690 1727204240.09460: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.09476: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.09478: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.09482: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.09592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.09718: done with get_vars() 22690 1727204240.09726: done getting variables 22690 1727204240.09777: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.576) 0:00:07.381 ***** 22690 1727204240.09799: entering _queue_task() for managed-node2/set_fact 22690 1727204240.10054: worker is 1 (out of 1 available) 22690 1727204240.10068: exiting _queue_task() for managed-node2/set_fact 22690 1727204240.10082: done queuing things up, now waiting for results queue to drain 22690 1727204240.10083: waiting for pending results... 
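
The "Set current_interfaces" task queued above is a set_fact at get_current_interfaces.yml:9. Given that it reads _current_interfaces and produces current_interfaces equal to the three stdout lines of the ls run, it plausibly reduces to the following; the exact expression is an assumption and the real file may normalize the list differently:

# Plausible reconstruction of the "Set current_interfaces" task (assumption).
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"
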
22690 1727204240.10251: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 22690 1727204240.10328: in run() - task 127b8e07-fff9-78bb-bf56-000000000116 22690 1727204240.10340: variable 'ansible_search_path' from source: unknown 22690 1727204240.10345: variable 'ansible_search_path' from source: unknown 22690 1727204240.10380: calling self._execute() 22690 1727204240.10453: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.10459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.10473: variable 'omit' from source: magic vars 22690 1727204240.10776: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.10788: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.10794: variable 'omit' from source: magic vars 22690 1727204240.10837: variable 'omit' from source: magic vars 22690 1727204240.10927: variable '_current_interfaces' from source: set_fact 22690 1727204240.10982: variable 'omit' from source: magic vars 22690 1727204240.11020: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204240.11050: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204240.11068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204240.11086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.11097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.11123: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204240.11126: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.11129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.11204: Set connection var ansible_connection to ssh 22690 1727204240.11214: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204240.11222: Set connection var ansible_pipelining to False 22690 1727204240.11225: Set connection var ansible_shell_type to sh 22690 1727204240.11231: Set connection var ansible_shell_executable to /bin/sh 22690 1727204240.11238: Set connection var ansible_timeout to 10 22690 1727204240.11258: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.11262: variable 'ansible_connection' from source: unknown 22690 1727204240.11266: variable 'ansible_module_compression' from source: unknown 22690 1727204240.11269: variable 'ansible_shell_type' from source: unknown 22690 1727204240.11272: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.11274: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.11277: variable 'ansible_pipelining' from source: unknown 22690 1727204240.11279: variable 'ansible_timeout' from source: unknown 22690 1727204240.11295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.11400: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 22690 1727204240.11411: variable 'omit' from source: magic vars 22690 1727204240.11415: starting attempt loop 22690 1727204240.11420: running the handler 22690 1727204240.11430: handler run complete 22690 1727204240.11438: attempt loop complete, returning result 22690 1727204240.11441: _execute() done 22690 1727204240.11444: dumping result to json 22690 1727204240.11448: done dumping result, returning 22690 1727204240.11456: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-78bb-bf56-000000000116] 22690 1727204240.11458: sending task result for task 127b8e07-fff9-78bb-bf56-000000000116 22690 1727204240.11551: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000116 22690 1727204240.11554: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22690 1727204240.11624: no more pending results, returning what we have 22690 1727204240.11627: results queue empty 22690 1727204240.11628: checking for any_errors_fatal 22690 1727204240.11639: done checking for any_errors_fatal 22690 1727204240.11639: checking for max_fail_percentage 22690 1727204240.11641: done checking for max_fail_percentage 22690 1727204240.11642: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.11643: done checking to see if all hosts have failed 22690 1727204240.11644: getting the remaining hosts for this loop 22690 1727204240.11645: done getting the remaining hosts for this loop 22690 1727204240.11650: getting the next task for host managed-node2 22690 1727204240.11658: done getting next task for host managed-node2 22690 1727204240.11661: ^ task is: TASK: Show current_interfaces 22690 1727204240.11666: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.11669: getting variables 22690 1727204240.11671: in VariableManager get_vars() 22690 1727204240.11707: Calling all_inventory to load vars for managed-node2 22690 1727204240.11710: Calling groups_inventory to load vars for managed-node2 22690 1727204240.11713: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.11726: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.11728: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.11731: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.11873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.12024: done with get_vars() 22690 1727204240.12032: done getting variables 22690 1727204240.12080: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.023) 0:00:07.404 ***** 22690 1727204240.12104: entering _queue_task() for managed-node2/debug 22690 1727204240.12338: worker is 1 (out of 1 available) 22690 1727204240.12354: exiting _queue_task() for managed-node2/debug 22690 1727204240.12370: done queuing things up, now waiting for results queue to drain 22690 1727204240.12371: waiting for pending results... 22690 1727204240.12546: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 22690 1727204240.12621: in run() - task 127b8e07-fff9-78bb-bf56-000000000105 22690 1727204240.12634: variable 'ansible_search_path' from source: unknown 22690 1727204240.12638: variable 'ansible_search_path' from source: unknown 22690 1727204240.12672: calling self._execute() 22690 1727204240.12740: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.12746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.12756: variable 'omit' from source: magic vars 22690 1727204240.13056: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.13068: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.13074: variable 'omit' from source: magic vars 22690 1727204240.13104: variable 'omit' from source: magic vars 22690 1727204240.13184: variable 'current_interfaces' from source: set_fact 22690 1727204240.13206: variable 'omit' from source: magic vars 22690 1727204240.13247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204240.13280: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204240.13299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204240.13314: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.13326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.13352: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204240.13355: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.13360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.13436: Set connection var ansible_connection to ssh 22690 1727204240.13445: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204240.13452: Set connection var ansible_pipelining to False 22690 1727204240.13455: Set connection var ansible_shell_type to sh 22690 1727204240.13461: Set connection var ansible_shell_executable to /bin/sh 22690 1727204240.13473: Set connection var ansible_timeout to 10 22690 1727204240.13491: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.13494: variable 'ansible_connection' from source: unknown 22690 1727204240.13497: variable 'ansible_module_compression' from source: unknown 22690 1727204240.13499: variable 'ansible_shell_type' from source: unknown 22690 1727204240.13502: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.13504: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.13508: variable 'ansible_pipelining' from source: unknown 22690 1727204240.13510: variable 'ansible_timeout' from source: unknown 22690 1727204240.13515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.13636: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204240.13645: variable 'omit' from source: magic vars 22690 1727204240.13650: starting attempt loop 22690 1727204240.13653: running the handler 22690 1727204240.13698: handler run complete 22690 1727204240.13714: attempt loop complete, returning result 22690 1727204240.13717: _execute() done 22690 1727204240.13722: dumping result to json 22690 1727204240.13724: done dumping result, returning 22690 1727204240.13733: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-78bb-bf56-000000000105] 22690 1727204240.13737: sending task result for task 127b8e07-fff9-78bb-bf56-000000000105 22690 1727204240.13831: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000105 22690 1727204240.13834: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22690 1727204240.13888: no more pending results, returning what we have 22690 1727204240.13891: results queue empty 22690 1727204240.13892: checking for any_errors_fatal 22690 1727204240.13897: done checking for any_errors_fatal 22690 1727204240.13897: checking for max_fail_percentage 22690 1727204240.13899: done checking for max_fail_percentage 22690 1727204240.13900: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.13901: done checking to see if all hosts have failed 22690 1727204240.13902: getting the remaining hosts for this loop 22690 1727204240.13903: done getting the remaining hosts for this loop 22690 1727204240.13907: getting the next task for host managed-node2 22690 1727204240.13915: done getting next task for host managed-node2 22690 1727204240.13919: ^ task is: TASK: Include the task 'manage_test_interface.yml' 22690 1727204240.13921: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204240.13924: getting variables 22690 1727204240.13926: in VariableManager get_vars() 22690 1727204240.13958: Calling all_inventory to load vars for managed-node2 22690 1727204240.13961: Calling groups_inventory to load vars for managed-node2 22690 1727204240.13964: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.13978: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.13981: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.13984: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.14136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.14260: done with get_vars() 22690 1727204240.14271: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.022) 0:00:07.426 ***** 22690 1727204240.14345: entering _queue_task() for managed-node2/include_tasks 22690 1727204240.14588: worker is 1 (out of 1 available) 22690 1727204240.14603: exiting _queue_task() for managed-node2/include_tasks 22690 1727204240.14619: done queuing things up, now waiting for results queue to drain 22690 1727204240.14621: waiting for pending results... 22690 1727204240.14786: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 22690 1727204240.14854: in run() - task 127b8e07-fff9-78bb-bf56-000000000011 22690 1727204240.14867: variable 'ansible_search_path' from source: unknown 22690 1727204240.14901: calling self._execute() 22690 1727204240.14968: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.14977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.14985: variable 'omit' from source: magic vars 22690 1727204240.15273: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.15305: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.15308: _execute() done 22690 1727204240.15311: dumping result to json 22690 1727204240.15313: done dumping result, returning 22690 1727204240.15319: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [127b8e07-fff9-78bb-bf56-000000000011] 22690 1727204240.15321: sending task result for task 127b8e07-fff9-78bb-bf56-000000000011 22690 1727204240.15403: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000011 22690 1727204240.15405: WORKER PROCESS EXITING 22690 1727204240.15439: no more pending results, returning what we have 22690 1727204240.15444: in VariableManager get_vars() 22690 1727204240.15483: Calling all_inventory to load vars for managed-node2 22690 1727204240.15486: Calling groups_inventory to load vars for managed-node2 22690 1727204240.15490: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.15505: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.15508: Calling groups_plugins_inventory to load vars for managed-node2 22690 
1727204240.15511: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.15719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.15842: done with get_vars() 22690 1727204240.15850: variable 'ansible_search_path' from source: unknown 22690 1727204240.15863: we have included files to process 22690 1727204240.15864: generating all_blocks data 22690 1727204240.15867: done generating all_blocks data 22690 1727204240.15871: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22690 1727204240.15872: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22690 1727204240.15873: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22690 1727204240.16261: in VariableManager get_vars() 22690 1727204240.16276: done with get_vars() 22690 1727204240.16444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 22690 1727204240.16893: done processing included file 22690 1727204240.16895: iterating over new_blocks loaded from include file 22690 1727204240.16896: in VariableManager get_vars() 22690 1727204240.16905: done with get_vars() 22690 1727204240.16906: filtering new block on tags 22690 1727204240.16929: done filtering new block on tags 22690 1727204240.16931: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 22690 1727204240.16935: extending task lists for all hosts with included blocks 22690 1727204240.17053: done extending task lists 22690 1727204240.17055: done processing included files 22690 1727204240.17056: results queue empty 22690 1727204240.17056: checking for any_errors_fatal 22690 1727204240.17058: done checking for any_errors_fatal 22690 1727204240.17059: checking for max_fail_percentage 22690 1727204240.17060: done checking for max_fail_percentage 22690 1727204240.17060: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.17061: done checking to see if all hosts have failed 22690 1727204240.17061: getting the remaining hosts for this loop 22690 1727204240.17062: done getting the remaining hosts for this loop 22690 1727204240.17064: getting the next task for host managed-node2 22690 1727204240.17068: done getting next task for host managed-node2 22690 1727204240.17070: ^ task is: TASK: Ensure state in ["present", "absent"] 22690 1727204240.17072: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.17073: getting variables 22690 1727204240.17074: in VariableManager get_vars() 22690 1727204240.17080: Calling all_inventory to load vars for managed-node2 22690 1727204240.17082: Calling groups_inventory to load vars for managed-node2 22690 1727204240.17083: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.17087: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.17089: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.17091: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.17185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.17300: done with get_vars() 22690 1727204240.17307: done getting variables 22690 1727204240.17357: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.030) 0:00:07.457 ***** 22690 1727204240.17380: entering _queue_task() for managed-node2/fail 22690 1727204240.17382: Creating lock for fail 22690 1727204240.17646: worker is 1 (out of 1 available) 22690 1727204240.17661: exiting _queue_task() for managed-node2/fail 22690 1727204240.17676: done queuing things up, now waiting for results queue to drain 22690 1727204240.17677: waiting for pending results... 
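The "Ensure state in [\"present\", \"absent\"]" task queued above is loaded as a fail action, and the skip result that follows reports state not in ["present", "absent"] as its false_condition. The guard at manage_test_interface.yml:3 is therefore presumably a small fail task along these lines (a sketch reconstructed from the log, not the exact source; the message text is assumed):

    - name: Ensure state in ["present", "absent"]
      fail:
        msg: "state must be one of: present, absent"   # message text assumed
      when: state not in ["present", "absent"]

Because state arrives from include params as either present or absent in this run, the condition evaluates to False and the guard is skipped instead of failing the play.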
22690 1727204240.17847: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 22690 1727204240.17919: in run() - task 127b8e07-fff9-78bb-bf56-000000000131 22690 1727204240.17928: variable 'ansible_search_path' from source: unknown 22690 1727204240.17933: variable 'ansible_search_path' from source: unknown 22690 1727204240.17967: calling self._execute() 22690 1727204240.18032: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.18038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.18048: variable 'omit' from source: magic vars 22690 1727204240.18378: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.18388: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.18494: variable 'state' from source: include params 22690 1727204240.18498: Evaluated conditional (state not in ["present", "absent"]): False 22690 1727204240.18501: when evaluation is False, skipping this task 22690 1727204240.18504: _execute() done 22690 1727204240.18508: dumping result to json 22690 1727204240.18511: done dumping result, returning 22690 1727204240.18521: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [127b8e07-fff9-78bb-bf56-000000000131] 22690 1727204240.18524: sending task result for task 127b8e07-fff9-78bb-bf56-000000000131 skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 22690 1727204240.18683: no more pending results, returning what we have 22690 1727204240.18689: results queue empty 22690 1727204240.18690: checking for any_errors_fatal 22690 1727204240.18692: done checking for any_errors_fatal 22690 1727204240.18693: checking for max_fail_percentage 22690 1727204240.18695: done checking for max_fail_percentage 22690 1727204240.18695: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.18697: done checking to see if all hosts have failed 22690 1727204240.18697: getting the remaining hosts for this loop 22690 1727204240.18699: done getting the remaining hosts for this loop 22690 1727204240.18703: getting the next task for host managed-node2 22690 1727204240.18709: done getting next task for host managed-node2 22690 1727204240.18712: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 22690 1727204240.18717: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.18722: getting variables 22690 1727204240.18723: in VariableManager get_vars() 22690 1727204240.18755: Calling all_inventory to load vars for managed-node2 22690 1727204240.18758: Calling groups_inventory to load vars for managed-node2 22690 1727204240.18761: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.18776: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.18779: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.18783: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.18963: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000131 22690 1727204240.18968: WORKER PROCESS EXITING 22690 1727204240.18982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.19106: done with get_vars() 22690 1727204240.19114: done getting variables 22690 1727204240.19159: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.018) 0:00:07.475 ***** 22690 1727204240.19183: entering _queue_task() for managed-node2/fail 22690 1727204240.19419: worker is 1 (out of 1 available) 22690 1727204240.19433: exiting _queue_task() for managed-node2/fail 22690 1727204240.19446: done queuing things up, now waiting for results queue to drain 22690 1727204240.19448: waiting for pending results... 
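The companion guard at manage_test_interface.yml:8 applies the same pattern to the interface type, which this run resolves from set_fact. A plausible sketch, inferred from the fail action and the false_condition reported in the skip result below (message text again assumed):

    - name: Ensure type in ["dummy", "tap", "veth"]
      fail:
        msg: "type must be one of: dummy, tap, veth"   # message text assumed
      when: type not in ["dummy", "tap", "veth"]

When the when expression evaluates to False, Ansible records it verbatim in the result's false_condition field together with skip_reason "Conditional result was False", which is exactly what the skipping output shows.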
22690 1727204240.19619: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 22690 1727204240.19694: in run() - task 127b8e07-fff9-78bb-bf56-000000000132 22690 1727204240.19706: variable 'ansible_search_path' from source: unknown 22690 1727204240.19710: variable 'ansible_search_path' from source: unknown 22690 1727204240.19745: calling self._execute() 22690 1727204240.19812: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.19818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.19830: variable 'omit' from source: magic vars 22690 1727204240.20130: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.20141: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.20252: variable 'type' from source: set_fact 22690 1727204240.20256: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 22690 1727204240.20259: when evaluation is False, skipping this task 22690 1727204240.20262: _execute() done 22690 1727204240.20268: dumping result to json 22690 1727204240.20271: done dumping result, returning 22690 1727204240.20278: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [127b8e07-fff9-78bb-bf56-000000000132] 22690 1727204240.20284: sending task result for task 127b8e07-fff9-78bb-bf56-000000000132 22690 1727204240.20381: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000132 22690 1727204240.20384: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 22690 1727204240.20435: no more pending results, returning what we have 22690 1727204240.20439: results queue empty 22690 1727204240.20440: checking for any_errors_fatal 22690 1727204240.20448: done checking for any_errors_fatal 22690 1727204240.20448: checking for max_fail_percentage 22690 1727204240.20450: done checking for max_fail_percentage 22690 1727204240.20451: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.20452: done checking to see if all hosts have failed 22690 1727204240.20452: getting the remaining hosts for this loop 22690 1727204240.20454: done getting the remaining hosts for this loop 22690 1727204240.20457: getting the next task for host managed-node2 22690 1727204240.20464: done getting next task for host managed-node2 22690 1727204240.20469: ^ task is: TASK: Include the task 'show_interfaces.yml' 22690 1727204240.20473: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.20476: getting variables 22690 1727204240.20478: in VariableManager get_vars() 22690 1727204240.20508: Calling all_inventory to load vars for managed-node2 22690 1727204240.20511: Calling groups_inventory to load vars for managed-node2 22690 1727204240.20515: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.20527: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.20529: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.20532: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.20677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.20828: done with get_vars() 22690 1727204240.20835: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.017) 0:00:07.492 ***** 22690 1727204240.20909: entering _queue_task() for managed-node2/include_tasks 22690 1727204240.21137: worker is 1 (out of 1 available) 22690 1727204240.21151: exiting _queue_task() for managed-node2/include_tasks 22690 1727204240.21163: done queuing things up, now waiting for results queue to drain 22690 1727204240.21167: waiting for pending results... 22690 1727204240.21332: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 22690 1727204240.21408: in run() - task 127b8e07-fff9-78bb-bf56-000000000133 22690 1727204240.21422: variable 'ansible_search_path' from source: unknown 22690 1727204240.21427: variable 'ansible_search_path' from source: unknown 22690 1727204240.21457: calling self._execute() 22690 1727204240.21524: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.21531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.21540: variable 'omit' from source: magic vars 22690 1727204240.21839: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.21854: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.21859: _execute() done 22690 1727204240.21862: dumping result to json 22690 1727204240.21867: done dumping result, returning 22690 1727204240.21875: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [127b8e07-fff9-78bb-bf56-000000000133] 22690 1727204240.21881: sending task result for task 127b8e07-fff9-78bb-bf56-000000000133 22690 1727204240.22003: no more pending results, returning what we have 22690 1727204240.22009: in VariableManager get_vars() 22690 1727204240.22048: Calling all_inventory to load vars for managed-node2 22690 1727204240.22051: Calling groups_inventory to load vars for managed-node2 22690 1727204240.22055: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.22074: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.22077: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.22080: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.22244: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000133 22690 1727204240.22248: WORKER PROCESS EXITING 22690 1727204240.22259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 22690 1727204240.22382: done with get_vars() 22690 1727204240.22388: variable 'ansible_search_path' from source: unknown 22690 1727204240.22388: variable 'ansible_search_path' from source: unknown 22690 1727204240.22421: we have included files to process 22690 1727204240.22422: generating all_blocks data 22690 1727204240.22424: done generating all_blocks data 22690 1727204240.22428: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22690 1727204240.22429: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22690 1727204240.22430: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22690 1727204240.22506: in VariableManager get_vars() 22690 1727204240.22523: done with get_vars() 22690 1727204240.22609: done processing included file 22690 1727204240.22611: iterating over new_blocks loaded from include file 22690 1727204240.22612: in VariableManager get_vars() 22690 1727204240.22622: done with get_vars() 22690 1727204240.22624: filtering new block on tags 22690 1727204240.22639: done filtering new block on tags 22690 1727204240.22640: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 22690 1727204240.22644: extending task lists for all hosts with included blocks 22690 1727204240.22919: done extending task lists 22690 1727204240.22920: done processing included files 22690 1727204240.22921: results queue empty 22690 1727204240.22921: checking for any_errors_fatal 22690 1727204240.22924: done checking for any_errors_fatal 22690 1727204240.22925: checking for max_fail_percentage 22690 1727204240.22925: done checking for max_fail_percentage 22690 1727204240.22926: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.22926: done checking to see if all hosts have failed 22690 1727204240.22927: getting the remaining hosts for this loop 22690 1727204240.22928: done getting the remaining hosts for this loop 22690 1727204240.22929: getting the next task for host managed-node2 22690 1727204240.22932: done getting next task for host managed-node2 22690 1727204240.22934: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22690 1727204240.22936: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.22938: getting variables 22690 1727204240.22938: in VariableManager get_vars() 22690 1727204240.22945: Calling all_inventory to load vars for managed-node2 22690 1727204240.22946: Calling groups_inventory to load vars for managed-node2 22690 1727204240.22948: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.22954: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.22956: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.22963: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.23088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.23204: done with get_vars() 22690 1727204240.23211: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.023) 0:00:07.516 ***** 22690 1727204240.23271: entering _queue_task() for managed-node2/include_tasks 22690 1727204240.23539: worker is 1 (out of 1 available) 22690 1727204240.23553: exiting _queue_task() for managed-node2/include_tasks 22690 1727204240.23567: done queuing things up, now waiting for results queue to drain 22690 1727204240.23568: waiting for pending results... 22690 1727204240.23749: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 22690 1727204240.23834: in run() - task 127b8e07-fff9-78bb-bf56-00000000015c 22690 1727204240.23844: variable 'ansible_search_path' from source: unknown 22690 1727204240.23848: variable 'ansible_search_path' from source: unknown 22690 1727204240.23882: calling self._execute() 22690 1727204240.23948: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.23953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.23963: variable 'omit' from source: magic vars 22690 1727204240.24271: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.24282: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.24288: _execute() done 22690 1727204240.24291: dumping result to json 22690 1727204240.24296: done dumping result, returning 22690 1727204240.24302: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [127b8e07-fff9-78bb-bf56-00000000015c] 22690 1727204240.24308: sending task result for task 127b8e07-fff9-78bb-bf56-00000000015c 22690 1727204240.24405: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000015c 22690 1727204240.24409: WORKER PROCESS EXITING 22690 1727204240.24443: no more pending results, returning what we have 22690 1727204240.24451: in VariableManager get_vars() 22690 1727204240.24488: Calling all_inventory to load vars for managed-node2 22690 1727204240.24491: Calling groups_inventory to load vars for managed-node2 22690 1727204240.24495: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.24512: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.24517: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.24528: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.24702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 22690 1727204240.24849: done with get_vars() 22690 1727204240.24856: variable 'ansible_search_path' from source: unknown 22690 1727204240.24856: variable 'ansible_search_path' from source: unknown 22690 1727204240.24901: we have included files to process 22690 1727204240.24902: generating all_blocks data 22690 1727204240.24903: done generating all_blocks data 22690 1727204240.24904: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22690 1727204240.24905: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22690 1727204240.24906: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22690 1727204240.25116: done processing included file 22690 1727204240.25118: iterating over new_blocks loaded from include file 22690 1727204240.25119: in VariableManager get_vars() 22690 1727204240.25130: done with get_vars() 22690 1727204240.25131: filtering new block on tags 22690 1727204240.25143: done filtering new block on tags 22690 1727204240.25145: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 22690 1727204240.25148: extending task lists for all hosts with included blocks 22690 1727204240.25249: done extending task lists 22690 1727204240.25250: done processing included files 22690 1727204240.25251: results queue empty 22690 1727204240.25251: checking for any_errors_fatal 22690 1727204240.25253: done checking for any_errors_fatal 22690 1727204240.25254: checking for max_fail_percentage 22690 1727204240.25255: done checking for max_fail_percentage 22690 1727204240.25255: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.25256: done checking to see if all hosts have failed 22690 1727204240.25256: getting the remaining hosts for this loop 22690 1727204240.25257: done getting the remaining hosts for this loop 22690 1727204240.25259: getting the next task for host managed-node2 22690 1727204240.25262: done getting next task for host managed-node2 22690 1727204240.25264: ^ task is: TASK: Gather current interface info 22690 1727204240.25268: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22690 1727204240.25270: getting variables 22690 1727204240.25270: in VariableManager get_vars() 22690 1727204240.25277: Calling all_inventory to load vars for managed-node2 22690 1727204240.25278: Calling groups_inventory to load vars for managed-node2 22690 1727204240.25280: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.25284: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.25286: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.25289: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.25389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.25506: done with get_vars() 22690 1727204240.25513: done getting variables 22690 1727204240.25545: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.022) 0:00:07.539 ***** 22690 1727204240.25569: entering _queue_task() for managed-node2/command 22690 1727204240.25829: worker is 1 (out of 1 available) 22690 1727204240.25844: exiting _queue_task() for managed-node2/command 22690 1727204240.25858: done queuing things up, now waiting for results queue to drain 22690 1727204240.25860: waiting for pending results... 
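The "Gather current interface info" task at get_current_interfaces.yml:3 uses the command action; the module arguments echoed later in the log (chdir /sys/class/net, raw params ls -1) and the _current_interfaces lookup in the following "Set current_interfaces" task suggest a task roughly like this (the register name and changed_when are assumptions consistent with the log, not quoted from the source file):

    - name: Gather current interface info
      command: ls -1
      args:
        chdir: /sys/class/net          # one interface name per line from sysfs
      register: _current_interfaces    # assumed: matches the variable read by "Set current_interfaces"
      changed_when: false              # assumed: consistent with "Evaluated conditional (False): False" after the run

Listing /sys/class/net is a read-only way to enumerate kernel network interfaces, so reporting the task as never changed keeps the test output idempotent.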
22690 1727204240.26040: running TaskExecutor() for managed-node2/TASK: Gather current interface info 22690 1727204240.26115: in run() - task 127b8e07-fff9-78bb-bf56-000000000193 22690 1727204240.26130: variable 'ansible_search_path' from source: unknown 22690 1727204240.26135: variable 'ansible_search_path' from source: unknown 22690 1727204240.26171: calling self._execute() 22690 1727204240.26236: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.26264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.26272: variable 'omit' from source: magic vars 22690 1727204240.26623: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.26638: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.26641: variable 'omit' from source: magic vars 22690 1727204240.26681: variable 'omit' from source: magic vars 22690 1727204240.26707: variable 'omit' from source: magic vars 22690 1727204240.26747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204240.26781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204240.26798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204240.26814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.26827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.26853: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204240.26856: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.26860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.26936: Set connection var ansible_connection to ssh 22690 1727204240.26945: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204240.26958: Set connection var ansible_pipelining to False 22690 1727204240.26976: Set connection var ansible_shell_type to sh 22690 1727204240.26981: Set connection var ansible_shell_executable to /bin/sh 22690 1727204240.26987: Set connection var ansible_timeout to 10 22690 1727204240.27006: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.27009: variable 'ansible_connection' from source: unknown 22690 1727204240.27012: variable 'ansible_module_compression' from source: unknown 22690 1727204240.27014: variable 'ansible_shell_type' from source: unknown 22690 1727204240.27017: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.27022: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.27027: variable 'ansible_pipelining' from source: unknown 22690 1727204240.27029: variable 'ansible_timeout' from source: unknown 22690 1727204240.27034: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.27152: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204240.27162: variable 'omit' from source: magic vars 22690 
1727204240.27168: starting attempt loop 22690 1727204240.27172: running the handler 22690 1727204240.27187: _low_level_execute_command(): starting 22690 1727204240.27195: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204240.27899: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.27947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.28022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.29815: stdout chunk (state=3): >>>/root <<< 22690 1727204240.30164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.30172: stdout chunk (state=3): >>><<< 22690 1727204240.30175: stderr chunk (state=3): >>><<< 22690 1727204240.30178: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.30180: _low_level_execute_command(): starting 22690 1727204240.30183: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100 `" && echo ansible-tmp-1727204240.3004935-23144-86431923645100="` echo /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100 `" ) && sleep 0' 22690 1727204240.30777: stderr chunk (state=2): 
>>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204240.30833: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.30955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204240.30997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204240.31024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.31125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.33230: stdout chunk (state=3): >>>ansible-tmp-1727204240.3004935-23144-86431923645100=/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100 <<< 22690 1727204240.33334: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.33405: stderr chunk (state=3): >>><<< 22690 1727204240.33416: stdout chunk (state=3): >>><<< 22690 1727204240.33453: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204240.3004935-23144-86431923645100=/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.33507: variable 'ansible_module_compression' from source: unknown 22690 1727204240.33679: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204240.33683: variable 'ansible_facts' from source: unknown 22690 1727204240.33723: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py 22690 1727204240.33918: Sending initial data 22690 1727204240.33929: Sent initial data (155 bytes) 22690 1727204240.34661: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204240.34807: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204240.34857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.34970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.36672: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22690 1727204240.36714: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204240.36773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204240.36873: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwp_dh1ur /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py <<< 22690 1727204240.36886: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py" <<< 22690 1727204240.36938: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwp_dh1ur" to remote "/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py" <<< 22690 1727204240.37955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.38027: stderr chunk (state=3): >>><<< 22690 1727204240.38031: stdout chunk (state=3): >>><<< 22690 1727204240.38034: done transferring module to remote 22690 1727204240.38058: _low_level_execute_command(): starting 22690 1727204240.38062: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/ /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py && sleep 0' 22690 1727204240.38565: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.38571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.38575: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204240.38578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.38631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204240.38639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.38708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.40725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.40731: stdout chunk (state=3): >>><<< 22690 1727204240.40734: stderr chunk (state=3): >>><<< 22690 1727204240.40764: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.40793: _low_level_execute_command(): starting 22690 1727204240.40899: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/AnsiballZ_command.py && sleep 0' 22690 1727204240.42193: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204240.42295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204240.42392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204240.42424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.42542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.59801: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:20.593083", "end": "2024-09-24 14:57:20.596678", "delta": "0:00:00.003595", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204240.61450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204240.61518: stderr chunk (state=3): >>><<< 22690 1727204240.61522: stdout chunk (state=3): >>><<< 22690 1727204240.61536: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:20.593083", "end": "2024-09-24 14:57:20.596678", "delta": "0:00:00.003595", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
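The JSON emitted by AnsiballZ_command.py above is what ends up in the registered variable. The raw module reports "changed": true, while the final task result below prints "changed": false; the intervening "Evaluated conditional (False): False" is consistent with a changed_when: false override on the task. The registered value is therefore roughly the following (field names are the command module's standard return values; stdout_lines is the split form of the stdout shown above, and the variable name is the assumed register target):

    _current_interfaces:               # assumed register name, see the later set_fact lookup
      changed: false
      rc: 0
      cmd: ["ls", "-1"]
      stdout: "bonding_masters\neth0\nlo"
      stdout_lines:
        - bonding_masters
        - eth0
        - lo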
22690 1727204240.61575: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204240.61585: _low_level_execute_command(): starting 22690 1727204240.61590: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204240.3004935-23144-86431923645100/ > /dev/null 2>&1 && sleep 0' 22690 1727204240.62096: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.62100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.62103: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.62106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.62169: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204240.62172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204240.62179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.62250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.64186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.64245: stderr chunk (state=3): >>><<< 22690 1727204240.64249: stdout chunk (state=3): >>><<< 22690 1727204240.64262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.64272: handler run complete 22690 1727204240.64295: Evaluated conditional (False): False 22690 1727204240.64307: attempt loop complete, returning result 22690 1727204240.64311: _execute() done 22690 1727204240.64313: dumping result to json 22690 1727204240.64320: done dumping result, returning 22690 1727204240.64327: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [127b8e07-fff9-78bb-bf56-000000000193] 22690 1727204240.64333: sending task result for task 127b8e07-fff9-78bb-bf56-000000000193 22690 1727204240.64450: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000193 22690 1727204240.64453: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003595", "end": "2024-09-24 14:57:20.596678", "rc": 0, "start": "2024-09-24 14:57:20.593083" } STDOUT: bonding_masters eth0 lo 22690 1727204240.64656: no more pending results, returning what we have 22690 1727204240.64660: results queue empty 22690 1727204240.64661: checking for any_errors_fatal 22690 1727204240.64663: done checking for any_errors_fatal 22690 1727204240.64663: checking for max_fail_percentage 22690 1727204240.64667: done checking for max_fail_percentage 22690 1727204240.64668: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.64669: done checking to see if all hosts have failed 22690 1727204240.64670: getting the remaining hosts for this loop 22690 1727204240.64671: done getting the remaining hosts for this loop 22690 1727204240.64675: getting the next task for host managed-node2 22690 1727204240.64682: done getting next task for host managed-node2 22690 1727204240.64684: ^ task is: TASK: Set current_interfaces 22690 1727204240.64696: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.64701: getting variables 22690 1727204240.64702: in VariableManager get_vars() 22690 1727204240.64807: Calling all_inventory to load vars for managed-node2 22690 1727204240.64810: Calling groups_inventory to load vars for managed-node2 22690 1727204240.64814: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.64828: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.64830: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.64833: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.65007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.65237: done with get_vars() 22690 1727204240.65250: done getting variables 22690 1727204240.65335: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.398) 0:00:07.937 ***** 22690 1727204240.65386: entering _queue_task() for managed-node2/set_fact 22690 1727204240.66004: worker is 1 (out of 1 available) 22690 1727204240.66019: exiting _queue_task() for managed-node2/set_fact 22690 1727204240.66033: done queuing things up, now waiting for results queue to drain 22690 1727204240.66034: waiting for pending results... 
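
[Editor's sketch] The trace above shows the "Gather current interface info" task returning "bonding_masters eth0 lo" from ls -1 in /sys/class/net, and the queued "Set current_interfaces" task (get_current_interfaces.yml:9) then registers those names as the fact current_interfaces. The actual tasks file is not shown in this log, so the following is only a hedged reconstruction from the module arguments visible above (chdir=/sys/class/net, _raw_params='ls -1', registered variable _current_interfaces); the use of stdout_lines is an assumption.

    # get_current_interfaces.yml -- hedged sketch, not the real collection file
    - name: Gather current interface info
      command:
        cmd: ls -1                  # matches '_raw_params' in the traced module args
        chdir: /sys/class/net       # matches 'chdir' in the traced module args
      register: _current_interfaces

    - name: Set current_interfaces
      set_fact:
        # assumed: fact built from the command's stdout_lines, which matches the
        # ["bonding_masters", "eth0", "lo"] result reported later in the log
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"
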
22690 1727204240.66283: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 22690 1727204240.66289: in run() - task 127b8e07-fff9-78bb-bf56-000000000194 22690 1727204240.66376: variable 'ansible_search_path' from source: unknown 22690 1727204240.66384: variable 'ansible_search_path' from source: unknown 22690 1727204240.66388: calling self._execute() 22690 1727204240.66483: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.66501: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.66522: variable 'omit' from source: magic vars 22690 1727204240.66993: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.67017: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.67071: variable 'omit' from source: magic vars 22690 1727204240.67118: variable 'omit' from source: magic vars 22690 1727204240.67282: variable '_current_interfaces' from source: set_fact 22690 1727204240.67371: variable 'omit' from source: magic vars 22690 1727204240.67476: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204240.67486: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204240.67519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204240.67573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.67606: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.67648: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204240.67695: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.67703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.67804: Set connection var ansible_connection to ssh 22690 1727204240.67831: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204240.67845: Set connection var ansible_pipelining to False 22690 1727204240.67852: Set connection var ansible_shell_type to sh 22690 1727204240.67913: Set connection var ansible_shell_executable to /bin/sh 22690 1727204240.67921: Set connection var ansible_timeout to 10 22690 1727204240.67923: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.67926: variable 'ansible_connection' from source: unknown 22690 1727204240.68025: variable 'ansible_module_compression' from source: unknown 22690 1727204240.68029: variable 'ansible_shell_type' from source: unknown 22690 1727204240.68031: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.68033: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.68037: variable 'ansible_pipelining' from source: unknown 22690 1727204240.68040: variable 'ansible_timeout' from source: unknown 22690 1727204240.68042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.68170: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 22690 1727204240.68189: variable 'omit' from source: magic vars 22690 1727204240.68200: starting attempt loop 22690 1727204240.68208: running the handler 22690 1727204240.68262: handler run complete 22690 1727204240.68272: attempt loop complete, returning result 22690 1727204240.68279: _execute() done 22690 1727204240.68288: dumping result to json 22690 1727204240.68354: done dumping result, returning 22690 1727204240.68359: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [127b8e07-fff9-78bb-bf56-000000000194] 22690 1727204240.68361: sending task result for task 127b8e07-fff9-78bb-bf56-000000000194 22690 1727204240.68448: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000194 22690 1727204240.68451: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22690 1727204240.68536: no more pending results, returning what we have 22690 1727204240.68541: results queue empty 22690 1727204240.68542: checking for any_errors_fatal 22690 1727204240.68551: done checking for any_errors_fatal 22690 1727204240.68552: checking for max_fail_percentage 22690 1727204240.68554: done checking for max_fail_percentage 22690 1727204240.68555: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.68556: done checking to see if all hosts have failed 22690 1727204240.68557: getting the remaining hosts for this loop 22690 1727204240.68558: done getting the remaining hosts for this loop 22690 1727204240.68671: getting the next task for host managed-node2 22690 1727204240.68685: done getting next task for host managed-node2 22690 1727204240.68688: ^ task is: TASK: Show current_interfaces 22690 1727204240.68693: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204240.68698: getting variables 22690 1727204240.68700: in VariableManager get_vars() 22690 1727204240.68736: Calling all_inventory to load vars for managed-node2 22690 1727204240.68740: Calling groups_inventory to load vars for managed-node2 22690 1727204240.68744: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.68759: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.68762: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.68900: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.69176: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.69329: done with get_vars() 22690 1727204240.69338: done getting variables 22690 1727204240.69388: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.040) 0:00:07.977 ***** 22690 1727204240.69415: entering _queue_task() for managed-node2/debug 22690 1727204240.69697: worker is 1 (out of 1 available) 22690 1727204240.69712: exiting _queue_task() for managed-node2/debug 22690 1727204240.69726: done queuing things up, now waiting for results queue to drain 22690 1727204240.69728: waiting for pending results... 22690 1727204240.69904: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 22690 1727204240.69980: in run() - task 127b8e07-fff9-78bb-bf56-00000000015d 22690 1727204240.69992: variable 'ansible_search_path' from source: unknown 22690 1727204240.69995: variable 'ansible_search_path' from source: unknown 22690 1727204240.70033: calling self._execute() 22690 1727204240.70103: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.70107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.70118: variable 'omit' from source: magic vars 22690 1727204240.70417: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.70430: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.70437: variable 'omit' from source: magic vars 22690 1727204240.70471: variable 'omit' from source: magic vars 22690 1727204240.70554: variable 'current_interfaces' from source: set_fact 22690 1727204240.70576: variable 'omit' from source: magic vars 22690 1727204240.70612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204240.70646: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204240.70664: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204240.70680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.70692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.70720: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204240.70725: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.70728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.70823: Set connection var ansible_connection to ssh 22690 1727204240.70832: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204240.70842: Set connection var ansible_pipelining to False 22690 1727204240.70845: Set connection var ansible_shell_type to sh 22690 1727204240.70848: Set connection var ansible_shell_executable to /bin/sh 22690 1727204240.70856: Set connection var ansible_timeout to 10 22690 1727204240.70876: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.70880: variable 'ansible_connection' from source: unknown 22690 1727204240.70883: variable 'ansible_module_compression' from source: unknown 22690 1727204240.70886: variable 'ansible_shell_type' from source: unknown 22690 1727204240.70889: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.70891: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.70894: variable 'ansible_pipelining' from source: unknown 22690 1727204240.70897: variable 'ansible_timeout' from source: unknown 22690 1727204240.70903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.71018: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204240.71029: variable 'omit' from source: magic vars 22690 1727204240.71034: starting attempt loop 22690 1727204240.71037: running the handler 22690 1727204240.71274: handler run complete 22690 1727204240.71278: attempt loop complete, returning result 22690 1727204240.71280: _execute() done 22690 1727204240.71282: dumping result to json 22690 1727204240.71285: done dumping result, returning 22690 1727204240.71287: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [127b8e07-fff9-78bb-bf56-00000000015d] 22690 1727204240.71289: sending task result for task 127b8e07-fff9-78bb-bf56-00000000015d 22690 1727204240.71359: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000015d 22690 1727204240.71362: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22690 1727204240.71426: no more pending results, returning what we have 22690 1727204240.71429: results queue empty 22690 1727204240.71430: checking for any_errors_fatal 22690 1727204240.71436: done checking for any_errors_fatal 22690 1727204240.71437: checking for max_fail_percentage 22690 1727204240.71439: done checking for max_fail_percentage 22690 1727204240.71440: checking to see if all hosts have failed and the running result is not ok 22690 1727204240.71440: done checking to see if all hosts have failed 22690 1727204240.71441: getting the remaining hosts for this loop 22690 1727204240.71442: done getting the remaining hosts for this loop 22690 1727204240.71446: getting the next task for host managed-node2 22690 1727204240.71453: done getting next task for host managed-node2 22690 1727204240.71455: ^ task is: TASK: Install iproute 22690 1727204240.71458: ^ state is: HOST STATE: block=2, task=5, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204240.71462: getting variables 22690 1727204240.71463: in VariableManager get_vars() 22690 1727204240.71492: Calling all_inventory to load vars for managed-node2 22690 1727204240.71495: Calling groups_inventory to load vars for managed-node2 22690 1727204240.71498: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204240.71510: Calling all_plugins_play to load vars for managed-node2 22690 1727204240.71512: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204240.71517: Calling groups_plugins_play to load vars for managed-node2 22690 1727204240.71639: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204240.71761: done with get_vars() 22690 1727204240.71771: done getting variables 22690 1727204240.71818: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:57:20 -0400 (0:00:00.024) 0:00:08.001 ***** 22690 1727204240.71844: entering _queue_task() for managed-node2/package 22690 1727204240.72093: worker is 1 (out of 1 available) 22690 1727204240.72112: exiting _queue_task() for managed-node2/package 22690 1727204240.72129: done queuing things up, now waiting for results queue to drain 22690 1727204240.72131: waiting for pending results... 
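
[Editor's sketch] The next block of the trace covers the "Install iproute" task (manage_test_interface.yml:16). The dnf module arguments that appear further down (name=["iproute"], state=present), the registered __install_status variable, and the "(__install_status is success)" conditional suggest a retrying package task roughly like the one below; the exact retry settings are not visible in this excerpt, so none are specified here and Ansible's defaults would apply.

    # manage_test_interface.yml -- hedged sketch of the install step
    - name: Install iproute
      package:
        name: iproute               # matches the dnf module args in the trace
        state: present
      register: __install_status
      # assumed: the '__install_status is success' check in the log is an until: clause
      until: __install_status is success
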
22690 1727204240.72485: running TaskExecutor() for managed-node2/TASK: Install iproute 22690 1727204240.72493: in run() - task 127b8e07-fff9-78bb-bf56-000000000134 22690 1727204240.72497: variable 'ansible_search_path' from source: unknown 22690 1727204240.72500: variable 'ansible_search_path' from source: unknown 22690 1727204240.72504: calling self._execute() 22690 1727204240.72574: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.72586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.72615: variable 'omit' from source: magic vars 22690 1727204240.72981: variable 'ansible_distribution_major_version' from source: facts 22690 1727204240.72992: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204240.72998: variable 'omit' from source: magic vars 22690 1727204240.73029: variable 'omit' from source: magic vars 22690 1727204240.73183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204240.75038: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204240.75087: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204240.75129: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204240.75370: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204240.75374: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204240.75377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204240.75380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204240.75382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204240.75417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204240.75437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204240.75548: variable '__network_is_ostree' from source: set_fact 22690 1727204240.75559: variable 'omit' from source: magic vars 22690 1727204240.75600: variable 'omit' from source: magic vars 22690 1727204240.75635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204240.75674: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204240.75699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204240.75742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 22690 1727204240.75758: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204240.75797: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204240.75806: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.75814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.75918: Set connection var ansible_connection to ssh 22690 1727204240.75936: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204240.75950: Set connection var ansible_pipelining to False 22690 1727204240.75959: Set connection var ansible_shell_type to sh 22690 1727204240.75973: Set connection var ansible_shell_executable to /bin/sh 22690 1727204240.75986: Set connection var ansible_timeout to 10 22690 1727204240.76021: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.76031: variable 'ansible_connection' from source: unknown 22690 1727204240.76038: variable 'ansible_module_compression' from source: unknown 22690 1727204240.76045: variable 'ansible_shell_type' from source: unknown 22690 1727204240.76052: variable 'ansible_shell_executable' from source: unknown 22690 1727204240.76058: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204240.76068: variable 'ansible_pipelining' from source: unknown 22690 1727204240.76076: variable 'ansible_timeout' from source: unknown 22690 1727204240.76083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204240.76196: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204240.76217: variable 'omit' from source: magic vars 22690 1727204240.76318: starting attempt loop 22690 1727204240.76321: running the handler 22690 1727204240.76324: variable 'ansible_facts' from source: unknown 22690 1727204240.76326: variable 'ansible_facts' from source: unknown 22690 1727204240.76328: _low_level_execute_command(): starting 22690 1727204240.76330: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204240.77000: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.77022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204240.77079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204240.77102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.77172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204240.78982: stdout chunk (state=3): >>>/root <<< 22690 1727204240.79474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.79478: stderr chunk (state=3): >>><<< 22690 1727204240.79480: stdout chunk (state=3): >>><<< 22690 1727204240.79484: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.79497: _low_level_execute_command(): starting 22690 1727204240.79500: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979 `" && echo ansible-tmp-1727204240.7934232-23272-159473245522979="` echo /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979 `" ) && sleep 0' 22690 1727204240.80841: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204240.81300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204240.81332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 
1727204240.83321: stdout chunk (state=3): >>>ansible-tmp-1727204240.7934232-23272-159473245522979=/root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979 <<< 22690 1727204240.83574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204240.83578: stdout chunk (state=3): >>><<< 22690 1727204240.83581: stderr chunk (state=3): >>><<< 22690 1727204240.83584: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204240.7934232-23272-159473245522979=/root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204240.83610: variable 'ansible_module_compression' from source: unknown 22690 1727204240.83683: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 22690 1727204240.83703: ANSIBALLZ: Acquiring lock 22690 1727204240.83710: ANSIBALLZ: Lock acquired: 139846653776800 22690 1727204240.83720: ANSIBALLZ: Creating module 22690 1727204241.05031: ANSIBALLZ: Writing module into payload 22690 1727204241.05346: ANSIBALLZ: Writing module 22690 1727204241.05383: ANSIBALLZ: Renaming module 22690 1727204241.05395: ANSIBALLZ: Done creating module 22690 1727204241.05429: variable 'ansible_facts' from source: unknown 22690 1727204241.05540: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py 22690 1727204241.05813: Sending initial data 22690 1727204241.05824: Sent initial data (152 bytes) 22690 1727204241.07102: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204241.07272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204241.07279: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 <<< 22690 1727204241.07282: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204241.07383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204241.07421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204241.07495: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204241.09330: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204241.09436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204241.09461: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpgwenpcaf /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py <<< 22690 1727204241.09466: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py" <<< 22690 1727204241.09601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpgwenpcaf" to remote "/root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py" <<< 22690 1727204241.11061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204241.11069: stdout chunk (state=3): >>><<< 22690 1727204241.11072: stderr chunk (state=3): >>><<< 22690 1727204241.11219: done transferring module to remote 22690 1727204241.11224: _low_level_execute_command(): starting 22690 1727204241.11227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/ /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py && sleep 0' 22690 1727204241.12175: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204241.12185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204241.12188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204241.12193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204241.12196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204241.12198: stderr chunk (state=3): >>>debug2: match not found <<< 22690 
1727204241.12200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204241.12260: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204241.12280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204241.12385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204241.14595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204241.14599: stdout chunk (state=3): >>><<< 22690 1727204241.14601: stderr chunk (state=3): >>><<< 22690 1727204241.14603: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204241.14605: _low_level_execute_command(): starting 22690 1727204241.14607: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/AnsiballZ_dnf.py && sleep 0' 22690 1727204241.15385: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204241.15390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204241.15437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204241.15511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.22990: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22690 1727204242.27501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204242.27569: stderr chunk (state=3): >>><<< 22690 1727204242.27574: stdout chunk (state=3): >>><<< 22690 1727204242.27594: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204242.27635: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204242.27645: _low_level_execute_command(): starting 22690 1727204242.27648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204240.7934232-23272-159473245522979/ > /dev/null 2>&1 && sleep 0' 22690 1727204242.28155: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.28159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.28161: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.28166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204242.28168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.28220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.28223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.28226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.28304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.30248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.30308: stderr chunk (state=3): >>><<< 22690 1727204242.30312: stdout chunk (state=3): >>><<< 22690 1727204242.30331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.30338: handler run complete 22690 1727204242.30467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204242.30605: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204242.30637: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204242.30664: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204242.30689: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204242.30747: variable '__install_status' from source: unknown 22690 1727204242.30764: Evaluated conditional (__install_status is success): True 22690 1727204242.30782: attempt loop complete, returning result 22690 1727204242.30786: _execute() done 22690 1727204242.30788: dumping result to json 22690 1727204242.30794: done dumping result, returning 22690 1727204242.30802: done running TaskExecutor() for managed-node2/TASK: Install iproute [127b8e07-fff9-78bb-bf56-000000000134] 22690 1727204242.30807: sending task result for task 127b8e07-fff9-78bb-bf56-000000000134 22690 1727204242.30912: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000134 22690 1727204242.30914: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22690 1727204242.31002: no more pending results, returning what we have 22690 1727204242.31006: results queue empty 22690 1727204242.31007: checking for any_errors_fatal 22690 1727204242.31012: done checking for any_errors_fatal 22690 1727204242.31012: checking for max_fail_percentage 22690 1727204242.31014: done checking for max_fail_percentage 22690 1727204242.31017: checking to see if all hosts have failed and the running result is not ok 22690 1727204242.31019: done checking to see if all hosts have failed 22690 1727204242.31019: getting the remaining hosts for this loop 22690 1727204242.31021: done getting the remaining hosts for this loop 22690 1727204242.31025: getting the next task for host managed-node2 22690 1727204242.31031: done getting next task for host managed-node2 22690 1727204242.31034: ^ task is: TASK: Create veth interface {{ interface }} 22690 1727204242.31036: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204242.31040: getting variables 22690 1727204242.31041: in VariableManager get_vars() 22690 1727204242.31073: Calling all_inventory to load vars for managed-node2 22690 1727204242.31076: Calling groups_inventory to load vars for managed-node2 22690 1727204242.31079: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204242.31091: Calling all_plugins_play to load vars for managed-node2 22690 1727204242.31094: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204242.31097: Calling groups_plugins_play to load vars for managed-node2 22690 1727204242.31298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204242.31422: done with get_vars() 22690 1727204242.31430: done getting variables 22690 1727204242.31477: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204242.31571: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:57:22 -0400 (0:00:01.597) 0:00:09.599 ***** 22690 1727204242.31595: entering _queue_task() for managed-node2/command 22690 1727204242.31834: worker is 1 (out of 1 available) 22690 1727204242.31849: exiting _queue_task() for managed-node2/command 22690 1727204242.31861: done queuing things up, now waiting for results queue to drain 22690 1727204242.31863: waiting for pending results... 
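
[Editor's sketch] The final task traced in this excerpt, "Create veth interface lsr27" (manage_test_interface.yml:27), loads the items lookup and runs the command action once per loop item, guarded by the conditional quoted verbatim in the log. The when: expression below is taken from that conditional; the looped ip commands themselves are an assumption (typical veth setup), since only the first item's module transfer is visible here.

    # manage_test_interface.yml -- hedged sketch; looped commands are assumed
    - name: Create veth interface {{ interface }}
      command: "{{ item }}"
      with_items:
        - ip link add {{ interface }} type veth peer name peer{{ interface }}   # assumed
        - ip link set peer{{ interface }} up                                    # assumed
        - ip link set {{ interface }} up                                        # assumed
      when: type == 'veth' and state == 'present' and interface not in current_interfaces
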
22690 1727204242.32025: running TaskExecutor() for managed-node2/TASK: Create veth interface lsr27 22690 1727204242.32098: in run() - task 127b8e07-fff9-78bb-bf56-000000000135 22690 1727204242.32110: variable 'ansible_search_path' from source: unknown 22690 1727204242.32118: variable 'ansible_search_path' from source: unknown 22690 1727204242.32331: variable 'interface' from source: set_fact 22690 1727204242.32395: variable 'interface' from source: set_fact 22690 1727204242.32571: variable 'interface' from source: set_fact 22690 1727204242.32663: Loaded config def from plugin (lookup/items) 22690 1727204242.32680: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 22690 1727204242.32715: variable 'omit' from source: magic vars 22690 1727204242.32833: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204242.32840: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204242.32853: variable 'omit' from source: magic vars 22690 1727204242.33032: variable 'ansible_distribution_major_version' from source: facts 22690 1727204242.33040: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204242.33185: variable 'type' from source: set_fact 22690 1727204242.33190: variable 'state' from source: include params 22690 1727204242.33193: variable 'interface' from source: set_fact 22690 1727204242.33201: variable 'current_interfaces' from source: set_fact 22690 1727204242.33208: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22690 1727204242.33211: variable 'omit' from source: magic vars 22690 1727204242.33241: variable 'omit' from source: magic vars 22690 1727204242.33275: variable 'item' from source: unknown 22690 1727204242.33331: variable 'item' from source: unknown 22690 1727204242.33345: variable 'omit' from source: magic vars 22690 1727204242.33374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204242.33401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204242.33420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204242.33436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204242.33447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204242.33473: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204242.33476: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204242.33479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204242.33555: Set connection var ansible_connection to ssh 22690 1727204242.33563: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204242.33572: Set connection var ansible_pipelining to False 22690 1727204242.33576: Set connection var ansible_shell_type to sh 22690 1727204242.33581: Set connection var ansible_shell_executable to /bin/sh 22690 1727204242.33588: Set connection var ansible_timeout to 10 22690 1727204242.33605: variable 'ansible_shell_executable' from source: unknown 22690 1727204242.33609: variable 'ansible_connection' from source: unknown 22690 1727204242.33613: variable 
'ansible_module_compression' from source: unknown 22690 1727204242.33618: variable 'ansible_shell_type' from source: unknown 22690 1727204242.33621: variable 'ansible_shell_executable' from source: unknown 22690 1727204242.33623: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204242.33627: variable 'ansible_pipelining' from source: unknown 22690 1727204242.33629: variable 'ansible_timeout' from source: unknown 22690 1727204242.33632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204242.33738: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204242.33751: variable 'omit' from source: magic vars 22690 1727204242.33756: starting attempt loop 22690 1727204242.33759: running the handler 22690 1727204242.33773: _low_level_execute_command(): starting 22690 1727204242.33781: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204242.34348: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.34353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.34356: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.34359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.34424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.34427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.34436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.34500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.36172: stdout chunk (state=3): >>>/root <<< 22690 1727204242.36480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.36485: stdout chunk (state=3): >>><<< 22690 1727204242.36487: stderr chunk (state=3): >>><<< 22690 1727204242.36491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.36502: _low_level_execute_command(): starting 22690 1727204242.36505: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890 `" && echo ansible-tmp-1727204242.3638344-23357-159199646375890="` echo /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890 `" ) && sleep 0' 22690 1727204242.37152: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.37175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204242.37256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.37310: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.37328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.37363: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.37483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.39478: stdout chunk (state=3): >>>ansible-tmp-1727204242.3638344-23357-159199646375890=/root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890 <<< 22690 1727204242.39669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.39704: stderr chunk (state=3): >>><<< 22690 1727204242.39708: stdout chunk (state=3): >>><<< 22690 1727204242.39730: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204242.3638344-23357-159199646375890=/root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.39862: variable 'ansible_module_compression' from source: unknown 22690 1727204242.39865: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204242.39899: variable 'ansible_facts' from source: unknown 22690 1727204242.40007: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py 22690 1727204242.40286: Sending initial data 22690 1727204242.40303: Sent initial data (156 bytes) 22690 1727204242.41029: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.41071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.41188: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.42803: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 
1727204242.42882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204242.42955: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpp5wbkht8 /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py <<< 22690 1727204242.42959: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py" <<< 22690 1727204242.43060: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpp5wbkht8" to remote "/root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py" <<< 22690 1727204242.44039: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.44138: stderr chunk (state=3): >>><<< 22690 1727204242.44145: stdout chunk (state=3): >>><<< 22690 1727204242.44148: done transferring module to remote 22690 1727204242.44150: _low_level_execute_command(): starting 22690 1727204242.44153: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/ /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py && sleep 0' 22690 1727204242.44880: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.44908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204242.45009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.45033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.45075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.45094: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.45129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.45249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.47159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.47164: stdout chunk (state=3): >>><<< 22690 1727204242.47169: stderr chunk (state=3): >>><<< 22690 1727204242.47195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.47290: _low_level_execute_command(): starting 22690 1727204242.47294: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/AnsiballZ_command.py && sleep 0' 22690 1727204242.47999: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.48022: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.48040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.48061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.48209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.65507: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:57:22.644810", "end": "2024-09-24 14:57:22.649922", "delta": "0:00:00.005112", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204242.68017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204242.68033: stdout chunk (state=3): >>><<< 22690 1727204242.68048: stderr chunk (state=3): >>><<< 22690 1727204242.68076: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:57:22.644810", "end": "2024-09-24 14:57:22.649922", "delta": "0:00:00.005112", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
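The exchange above is one complete remote-execution cycle for the first loop item: the ssh connection plugin probes the remote home directory (echo ~), creates a temporary directory under ~/.ansible/tmp, transfers the AnsiballZ-packed command module over SFTP, chmods and runs it with /usr/bin/python3.12, and reads the module's JSON result back on stdout. The sketch below is a minimal, illustrative stand-in for what the transferred AnsiballZ_command.py effectively did for this item; it is not the real ansible.modules.command source. It runs the argv without a shell and reports rc, stdout, stderr and timing using the same keys that appear in the log, and it assumes a Linux host with the iproute2 `ip` tool and sufficient privileges (the interface names are taken from the log).

import datetime
import subprocess

def run_command(argv):
    """Run argv without a shell and return a result shaped like the log's JSON."""
    start = datetime.datetime.now()
    proc = subprocess.run(argv, capture_output=True, text=True)
    end = datetime.datetime.now()
    return {
        "changed": True,
        "cmd": argv,
        "rc": proc.returncode,
        "stdout": proc.stdout.rstrip("\n"),
        "stderr": proc.stderr.rstrip("\n"),
        "start": str(start),
        "end": str(end),
        "delta": str(end - start),
    }

if __name__ == "__main__":
    # First loop item from the log: create the veth pair lsr27 <-> peerlsr27.
    print(run_command(["ip", "link", "add", "lsr27", "type", "veth",
                       "peer", "name", "peerlsr27"]))
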
22690 1727204242.68128: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204242.68227: _low_level_execute_command(): starting 22690 1727204242.68230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204242.3638344-23357-159199646375890/ > /dev/null 2>&1 && sleep 0' 22690 1727204242.68822: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.68840: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204242.68854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.68878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204242.68896: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204242.68910: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204242.68925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.68944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204242.68958: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204242.69064: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204242.69071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.69093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.69193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.73712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.73730: stdout chunk (state=3): >>><<< 22690 1727204242.73742: stderr chunk (state=3): >>><<< 22690 1727204242.73763: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.73778: handler run complete 22690 1727204242.73807: Evaluated conditional (False): False 22690 1727204242.73825: attempt loop complete, returning result 22690 1727204242.73855: variable 'item' from source: unknown 22690 1727204242.73954: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005112", "end": "2024-09-24 14:57:22.649922", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-24 14:57:22.644810" } 22690 1727204242.74477: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204242.74481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204242.74483: variable 'omit' from source: magic vars 22690 1727204242.74486: variable 'ansible_distribution_major_version' from source: facts 22690 1727204242.74488: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204242.74735: variable 'type' from source: set_fact 22690 1727204242.74746: variable 'state' from source: include params 22690 1727204242.74754: variable 'interface' from source: set_fact 22690 1727204242.74762: variable 'current_interfaces' from source: set_fact 22690 1727204242.74803: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22690 1727204242.74807: variable 'omit' from source: magic vars 22690 1727204242.74809: variable 'omit' from source: magic vars 22690 1727204242.74859: variable 'item' from source: unknown 22690 1727204242.74943: variable 'item' from source: unknown 22690 1727204242.74970: variable 'omit' from source: magic vars 22690 1727204242.75020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204242.75023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204242.75026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204242.75041: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204242.75048: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204242.75055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204242.75238: Set connection var ansible_connection to ssh 22690 1727204242.75241: Set 
connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204242.75244: Set connection var ansible_pipelining to False 22690 1727204242.75246: Set connection var ansible_shell_type to sh 22690 1727204242.75247: Set connection var ansible_shell_executable to /bin/sh 22690 1727204242.75249: Set connection var ansible_timeout to 10 22690 1727204242.75251: variable 'ansible_shell_executable' from source: unknown 22690 1727204242.75253: variable 'ansible_connection' from source: unknown 22690 1727204242.75255: variable 'ansible_module_compression' from source: unknown 22690 1727204242.75257: variable 'ansible_shell_type' from source: unknown 22690 1727204242.75259: variable 'ansible_shell_executable' from source: unknown 22690 1727204242.75261: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204242.75263: variable 'ansible_pipelining' from source: unknown 22690 1727204242.75265: variable 'ansible_timeout' from source: unknown 22690 1727204242.75269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204242.75376: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204242.75391: variable 'omit' from source: magic vars 22690 1727204242.75400: starting attempt loop 22690 1727204242.75407: running the handler 22690 1727204242.75418: _low_level_execute_command(): starting 22690 1727204242.75427: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204242.76115: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.76135: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204242.76188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.76211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204242.76281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.76313: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.76341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.76368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.76473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.78184: stdout chunk (state=3): >>>/root <<< 22690 1727204242.78401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.78406: stdout chunk (state=3): >>><<< 
22690 1727204242.78409: stderr chunk (state=3): >>><<< 22690 1727204242.78520: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.78524: _low_level_execute_command(): starting 22690 1727204242.78527: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272 `" && echo ansible-tmp-1727204242.784294-23357-1569259721272="` echo /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272 `" ) && sleep 0' 22690 1727204242.79140: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.79154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204242.79167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.79186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204242.79200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204242.79220: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204242.79232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.79333: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204242.79346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.79371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.79482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.81459: stdout chunk (state=3): 
>>>ansible-tmp-1727204242.784294-23357-1569259721272=/root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272 <<< 22690 1727204242.81881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.81892: stdout chunk (state=3): >>><<< 22690 1727204242.81904: stderr chunk (state=3): >>><<< 22690 1727204242.81929: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204242.784294-23357-1569259721272=/root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.81960: variable 'ansible_module_compression' from source: unknown 22690 1727204242.82007: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204242.82072: variable 'ansible_facts' from source: unknown 22690 1727204242.82120: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py 22690 1727204242.82295: Sending initial data 22690 1727204242.82302: Sent initial data (153 bytes) 22690 1727204242.82961: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.83086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.83143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.83219: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 22690 1727204242.84848: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204242.84929: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204242.84990: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpimp4bun1 /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py <<< 22690 1727204242.84993: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py" <<< 22690 1727204242.85091: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpimp4bun1" to remote "/root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py" <<< 22690 1727204242.86175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.86179: stderr chunk (state=3): >>><<< 22690 1727204242.86182: stdout chunk (state=3): >>><<< 22690 1727204242.86184: done transferring module to remote 22690 1727204242.86188: _low_level_execute_command(): starting 22690 1727204242.86191: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/ /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py && sleep 0' 22690 1727204242.86849: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204242.86880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204242.86898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204242.86923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204242.86942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204242.86955: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204242.86974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.86993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204242.87006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204242.87020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204242.87035: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 22690 1727204242.87126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204242.87150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.87258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204242.93134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204242.93208: stderr chunk (state=3): >>><<< 22690 1727204242.93222: stdout chunk (state=3): >>><<< 22690 1727204242.93248: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204242.93257: _low_level_execute_command(): starting 22690 1727204242.93270: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/AnsiballZ_command.py && sleep 0' 22690 1727204242.93989: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204242.94100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 
22690 1727204242.94128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204242.94248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.11229: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:57:23.106842", "end": "2024-09-24 14:57:23.110736", "delta": "0:00:00.003894", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204243.12854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204243.12858: stdout chunk (state=3): >>><<< 22690 1727204243.12861: stderr chunk (state=3): >>><<< 22690 1727204243.12884: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:57:23.106842", "end": "2024-09-24 14:57:23.110736", "delta": "0:00:00.003894", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
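The same cycle has now repeated for the second loop item, ip link set peerlsr27 up. Before each item the log shows the task's conditional being evaluated (type == 'veth' and state == 'present' and interface not in current_interfaces) and, when it holds, the item being handed to the command action plugin. The snippet below is a rough, illustrative rendering of that per-item gate-and-run flow, not Ansible's executor code: the two items are the ones seen in this excerpt, iface_type stands in for the log's variable named `type`, and the current_interfaces list is a placeholder (the real value comes from set_fact on the managed node).

import subprocess

# Values reported by the log for this task; current_interfaces is a placeholder.
iface_type = "veth"                  # the log's variable is named `type`
state = "present"
interface = "lsr27"
current_interfaces = ["lo", "eth0"]  # assumed; really gathered via set_fact

# The two loop items executed in this excerpt.
items = [
    "ip link add lsr27 type veth peer name peerlsr27",
    "ip link set peerlsr27 up",
]

for item in items:
    # Per-item conditional, as evaluated in the log before each run.
    if iface_type == "veth" and state == "present" and interface not in current_interfaces:
        proc = subprocess.run(item.split(), capture_output=True, text=True)
        print(f"ok: (item={item}) rc={proc.returncode}")
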
22690 1727204243.12961: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204243.12968: _low_level_execute_command(): starting 22690 1727204243.12971: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204242.784294-23357-1569259721272/ > /dev/null 2>&1 && sleep 0' 22690 1727204243.13874: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204243.13892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.13911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204243.13944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204243.14038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.14073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.14090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.14111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.14214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.16256: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.16260: stdout chunk (state=3): >>><<< 22690 1727204243.16262: stderr chunk (state=3): >>><<< 22690 1727204243.16287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.16303: handler run complete 22690 1727204243.16470: Evaluated conditional (False): False 22690 1727204243.16474: attempt loop complete, returning result 22690 1727204243.16476: variable 'item' from source: unknown 22690 1727204243.16478: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003894", "end": "2024-09-24 14:57:23.110736", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-24 14:57:23.106842" } 22690 1727204243.16810: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.16814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.16819: variable 'omit' from source: magic vars 22690 1727204243.16952: variable 'ansible_distribution_major_version' from source: facts 22690 1727204243.17047: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204243.17195: variable 'type' from source: set_fact 22690 1727204243.17207: variable 'state' from source: include params 22690 1727204243.17219: variable 'interface' from source: set_fact 22690 1727204243.17229: variable 'current_interfaces' from source: set_fact 22690 1727204243.17240: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22690 1727204243.17249: variable 'omit' from source: magic vars 22690 1727204243.17278: variable 'omit' from source: magic vars 22690 1727204243.17331: variable 'item' from source: unknown 22690 1727204243.17410: variable 'item' from source: unknown 22690 1727204243.17435: variable 'omit' from source: magic vars 22690 1727204243.17463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204243.17486: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204243.17500: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204243.17594: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204243.17599: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.17601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.17634: Set connection var ansible_connection to ssh 22690 1727204243.17650: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204243.17664: Set connection var ansible_pipelining to False 22690 1727204243.17674: Set connection var ansible_shell_type to sh 22690 1727204243.17685: Set connection var ansible_shell_executable to /bin/sh 22690 1727204243.17705: Set connection var 
ansible_timeout to 10 22690 1727204243.17734: variable 'ansible_shell_executable' from source: unknown 22690 1727204243.17742: variable 'ansible_connection' from source: unknown 22690 1727204243.17749: variable 'ansible_module_compression' from source: unknown 22690 1727204243.17757: variable 'ansible_shell_type' from source: unknown 22690 1727204243.17764: variable 'ansible_shell_executable' from source: unknown 22690 1727204243.17774: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.17782: variable 'ansible_pipelining' from source: unknown 22690 1727204243.17811: variable 'ansible_timeout' from source: unknown 22690 1727204243.17814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.17926: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204243.17972: variable 'omit' from source: magic vars 22690 1727204243.17975: starting attempt loop 22690 1727204243.17977: running the handler 22690 1727204243.17980: _low_level_execute_command(): starting 22690 1727204243.17982: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204243.18711: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204243.18733: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.18750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204243.18778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204243.18899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.18904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.18935: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.19054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.20718: stdout chunk (state=3): >>>/root <<< 22690 1727204243.20914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.20945: stdout chunk (state=3): >>><<< 22690 1727204243.20948: stderr chunk (state=3): >>><<< 22690 1727204243.21057: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.21061: _low_level_execute_command(): starting 22690 1727204243.21064: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945 `" && echo ansible-tmp-1727204243.209646-23357-149178596465945="` echo /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945 `" ) && sleep 0' 22690 1727204243.21687: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204243.21705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.21824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204243.21840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.21854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.21878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.22002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.23997: stdout chunk (state=3): >>>ansible-tmp-1727204243.209646-23357-149178596465945=/root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945 <<< 22690 1727204243.24183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.24200: stdout chunk (state=3): >>><<< 22690 1727204243.24215: stderr chunk (state=3): >>><<< 22690 1727204243.24238: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.209646-23357-149178596465945=/root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.24269: variable 'ansible_module_compression' from source: unknown 22690 1727204243.24378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204243.24381: variable 'ansible_facts' from source: unknown 22690 1727204243.24413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py 22690 1727204243.24590: Sending initial data 22690 1727204243.24609: Sent initial data (155 bytes) 22690 1727204243.25360: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204243.25476: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.25715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.25835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.27459: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204243.27567: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204243.27641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpspcxv0pt /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py <<< 22690 1727204243.27645: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py" <<< 22690 1727204243.27712: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpspcxv0pt" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py" <<< 22690 1727204243.28588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.28653: stderr chunk (state=3): >>><<< 22690 1727204243.28662: stdout chunk (state=3): >>><<< 22690 1727204243.28696: done transferring module to remote 22690 1727204243.28708: _low_level_execute_command(): starting 22690 1727204243.28788: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/ /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py && sleep 0' 22690 1727204243.29475: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204243.29496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.29553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.29573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.29606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.29717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.31641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.31646: stdout chunk (state=3): >>><<< 22690 1727204243.31649: stderr chunk (state=3): >>><<< 22690 1727204243.31762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.31767: _low_level_execute_command(): starting 22690 1727204243.31771: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/AnsiballZ_command.py && sleep 0' 22690 1727204243.32495: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.32513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.32533: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.32552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.32667: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.49684: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:57:23.490628", "end": "2024-09-24 14:57:23.494717", "delta": "0:00:00.004089", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204243.51306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204243.51309: stdout chunk (state=3): >>><<< 22690 1727204243.51312: stderr chunk (state=3): >>><<< 22690 1727204243.51463: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:57:23.490628", "end": "2024-09-24 14:57:23.494717", "delta": "0:00:00.004089", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
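The remote module run above comes back to the controller as a single JSON object printed on the payload's stdout (rc=0, stdout={...}); the next entries show the controller consuming it in _execute_module. As a minimal sketch of what that result carries, not Ansible's actual result handling, the abridged JSON from this run can be parsed directly (the "invocation" block is omitted here for brevity):

    import json

    # Abridged module stdout as captured in the log above (invocation args omitted)
    module_stdout = '''{"changed": true, "stdout": "", "stderr": "", "rc": 0,
        "cmd": ["ip", "link", "set", "lsr27", "up"],
        "start": "2024-09-24 14:57:23.490628", "end": "2024-09-24 14:57:23.494717",
        "delta": "0:00:00.004089", "msg": ""}'''

    result = json.loads(module_stdout)
    print(result["rc"], result["changed"], " ".join(result["cmd"]))
    # -> 0 True ip link set lsr27 up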
22690 1727204243.51475: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204243.51479: _low_level_execute_command(): starting 22690 1727204243.51481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.209646-23357-149178596465945/ > /dev/null 2>&1 && sleep 0' 22690 1727204243.52138: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204243.52273: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.52294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.52315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.52431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.54545: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.54878: stderr chunk (state=3): >>><<< 22690 1727204243.54883: stdout chunk (state=3): >>><<< 22690 1727204243.54886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.54889: handler run complete 22690 1727204243.54892: Evaluated conditional (False): False 22690 1727204243.54894: attempt loop complete, returning result 22690 1727204243.54896: variable 'item' from source: unknown 22690 1727204243.54972: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.004089", "end": "2024-09-24 14:57:23.494717", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-24 14:57:23.490628" } 22690 1727204243.55476: dumping result to json 22690 1727204243.55483: done dumping result, returning 22690 1727204243.55487: done running TaskExecutor() for managed-node2/TASK: Create veth interface lsr27 [127b8e07-fff9-78bb-bf56-000000000135] 22690 1727204243.55489: sending task result for task 127b8e07-fff9-78bb-bf56-000000000135 22690 1727204243.55660: no more pending results, returning what we have 22690 1727204243.55664: results queue empty 22690 1727204243.55667: checking for any_errors_fatal 22690 1727204243.55673: done checking for any_errors_fatal 22690 1727204243.55674: checking for max_fail_percentage 22690 1727204243.55676: done checking for max_fail_percentage 22690 1727204243.55677: checking to see if all hosts have failed and the running result is not ok 22690 1727204243.55679: done checking to see if all hosts have failed 22690 1727204243.55680: getting the remaining hosts for this loop 22690 1727204243.55681: done getting the remaining hosts for this loop 22690 1727204243.55685: getting the next task for host managed-node2 22690 1727204243.55692: done getting next task for host managed-node2 22690 1727204243.55695: ^ task is: TASK: Set up veth as managed by NetworkManager 22690 1727204243.55699: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204243.55703: getting variables 22690 1727204243.55704: in VariableManager get_vars() 22690 1727204243.55739: Calling all_inventory to load vars for managed-node2 22690 1727204243.55744: Calling groups_inventory to load vars for managed-node2 22690 1727204243.55748: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204243.55761: Calling all_plugins_play to load vars for managed-node2 22690 1727204243.55996: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204243.56003: Calling groups_plugins_play to load vars for managed-node2 22690 1727204243.56375: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000135 22690 1727204243.56378: WORKER PROCESS EXITING 22690 1727204243.56404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204243.56741: done with get_vars() 22690 1727204243.56758: done getting variables 22690 1727204243.56830: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:57:23 -0400 (0:00:01.252) 0:00:10.852 ***** 22690 1727204243.56870: entering _queue_task() for managed-node2/command 22690 1727204243.57468: worker is 1 (out of 1 available) 22690 1727204243.57483: exiting _queue_task() for managed-node2/command 22690 1727204243.57495: done queuing things up, now waiting for results queue to drain 22690 1727204243.57497: waiting for pending results... 
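The lines above trace one complete module execution for the create-veth task: probe the remote home with `echo ~`, create a per-task temp directory under /root/.ansible/tmp, push AnsiballZ_command.py over sftp, chmod it, run it with /usr/bin/python3.12, and remove the temp directory afterwards. The following is only a rough, illustrative sketch of that sequence using plain ssh/sftp subprocesses; the real ssh connection plugin multiplexes everything over one ControlPersist master (the auto-mux / mux_client lines), and the host and temp-dir names here are placeholders taken from this log:

    import subprocess

    host = "root@10.31.47.73"                       # placeholder; the target address seen in the log
    tmp = "/root/.ansible/tmp/ansible-tmp-example"  # placeholder for the per-task temp directory
    payload = "AnsiballZ_command.py"                # module payload built on the controller

    def ssh(cmd):
        # each _low_level_execute_command() boils down to: ssh <host> "/bin/sh -c '<cmd>'"
        return subprocess.run(["ssh", host, cmd], capture_output=True, text=True)

    ssh("echo ~ && sleep 0")                                        # resolve the remote home directory
    ssh(f"( umask 77 && mkdir -p {tmp} ) && sleep 0")               # create the per-task temp dir
    subprocess.run(["sftp", host], text=True, capture_output=True,
                   input=f"put {payload} {tmp}/{payload}\n")        # transfer the module payload
    ssh(f"chmod u+x {tmp}/ {tmp}/{payload} && sleep 0")             # make dir and payload executable
    run = ssh(f"/usr/bin/python3.12 {tmp}/{payload} && sleep 0")    # execute the module remotely
    ssh(f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0")             # clean up the temp dir
    print(run.returncode, run.stdout)                               # module result JSON on stdout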
22690 1727204243.57775: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 22690 1727204243.57918: in run() - task 127b8e07-fff9-78bb-bf56-000000000136 22690 1727204243.57944: variable 'ansible_search_path' from source: unknown 22690 1727204243.57955: variable 'ansible_search_path' from source: unknown 22690 1727204243.58010: calling self._execute() 22690 1727204243.58114: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.58127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.58142: variable 'omit' from source: magic vars 22690 1727204243.58629: variable 'ansible_distribution_major_version' from source: facts 22690 1727204243.58649: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204243.58900: variable 'type' from source: set_fact 22690 1727204243.58905: variable 'state' from source: include params 22690 1727204243.58912: Evaluated conditional (type == 'veth' and state == 'present'): True 22690 1727204243.58921: variable 'omit' from source: magic vars 22690 1727204243.58978: variable 'omit' from source: magic vars 22690 1727204243.59060: variable 'interface' from source: set_fact 22690 1727204243.59074: variable 'omit' from source: magic vars 22690 1727204243.59110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204243.59147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204243.59168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204243.59182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204243.59193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204243.59223: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204243.59226: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.59229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.59304: Set connection var ansible_connection to ssh 22690 1727204243.59314: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204243.59324: Set connection var ansible_pipelining to False 22690 1727204243.59326: Set connection var ansible_shell_type to sh 22690 1727204243.59332: Set connection var ansible_shell_executable to /bin/sh 22690 1727204243.59339: Set connection var ansible_timeout to 10 22690 1727204243.59358: variable 'ansible_shell_executable' from source: unknown 22690 1727204243.59361: variable 'ansible_connection' from source: unknown 22690 1727204243.59364: variable 'ansible_module_compression' from source: unknown 22690 1727204243.59368: variable 'ansible_shell_type' from source: unknown 22690 1727204243.59371: variable 'ansible_shell_executable' from source: unknown 22690 1727204243.59373: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.59383: variable 'ansible_pipelining' from source: unknown 22690 1727204243.59385: variable 'ansible_timeout' from source: unknown 22690 1727204243.59387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.59507: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204243.59515: variable 'omit' from source: magic vars 22690 1727204243.59523: starting attempt loop 22690 1727204243.59526: running the handler 22690 1727204243.59540: _low_level_execute_command(): starting 22690 1727204243.59548: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204243.60104: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204243.60110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.60119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.60177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.60182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.60185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.60246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.61882: stdout chunk (state=3): >>>/root <<< 22690 1727204243.61995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.62062: stderr chunk (state=3): >>><<< 22690 1727204243.62069: stdout chunk (state=3): >>><<< 22690 1727204243.62091: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 22690 1727204243.62103: _low_level_execute_command(): starting 22690 1727204243.62109: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535 `" && echo ansible-tmp-1727204243.620905-23409-235209928451535="` echo /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535 `" ) && sleep 0' 22690 1727204243.62600: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204243.62605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.62621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204243.62624: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.62664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.62675: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.62677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.62747: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.64700: stdout chunk (state=3): >>>ansible-tmp-1727204243.620905-23409-235209928451535=/root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535 <<< 22690 1727204243.64813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.64880: stderr chunk (state=3): >>><<< 22690 1727204243.64883: stdout chunk (state=3): >>><<< 22690 1727204243.64901: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.620905-23409-235209928451535=/root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.64931: variable 'ansible_module_compression' from source: unknown 22690 1727204243.64977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204243.65008: variable 'ansible_facts' from source: unknown 22690 1727204243.65067: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py 22690 1727204243.65180: Sending initial data 22690 1727204243.65183: Sent initial data (155 bytes) 22690 1727204243.65691: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.65695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.65698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.65700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.65758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.65761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.65842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.67428: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204243.67499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204243.67574: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpc48f1ajy /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py <<< 22690 1727204243.67576: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py" <<< 22690 1727204243.67639: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpc48f1ajy" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py" <<< 22690 1727204243.67646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py" <<< 22690 1727204243.68279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.68358: stderr chunk (state=3): >>><<< 22690 1727204243.68361: stdout chunk (state=3): >>><<< 22690 1727204243.68387: done transferring module to remote 22690 1727204243.68397: _low_level_execute_command(): starting 22690 1727204243.68403: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/ /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py && sleep 0' 22690 1727204243.68887: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.68891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.68894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.68896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.68953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.68957: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.68963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.69036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.70841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.70903: stderr chunk (state=3): >>><<< 22690 1727204243.70907: stdout chunk (state=3): >>><<< 22690 1727204243.70960: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.70963: _low_level_execute_command(): starting 22690 1727204243.70967: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/AnsiballZ_command.py && sleep 0' 22690 1727204243.71432: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204243.71436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.71438: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204243.71441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204243.71443: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204243.71503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204243.71508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.71510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.71585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.89986: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:57:23.878602", "end": "2024-09-24 14:57:23.897747", "delta": "0:00:00.019145", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204243.91708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204243.91712: stdout chunk (state=3): >>><<< 22690 1727204243.91714: stderr chunk (state=3): >>><<< 22690 1727204243.91733: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:57:23.878602", "end": "2024-09-24 14:57:23.897747", "delta": "0:00:00.019145", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
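Both command results in this section report start, end, and delta; delta is simply the wall-clock difference between the two remote timestamps. A quick check against the nmcli result above (a verification aid, not part of the playbook):

    from datetime import datetime

    fmt = "%Y-%m-%d %H:%M:%S.%f"
    start = datetime.strptime("2024-09-24 14:57:23.878602", fmt)  # "start" from the nmcli result
    end = datetime.strptime("2024-09-24 14:57:23.897747", fmt)    # "end" from the same result
    print(end - start)  # 0:00:00.019145, matching the reported "delta"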
22690 1727204243.91786: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204243.91813: _low_level_execute_command(): starting 22690 1727204243.91817: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.620905-23409-235209928451535/ > /dev/null 2>&1 && sleep 0' 22690 1727204243.92583: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204243.92613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204243.92717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204243.94614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204243.94686: stderr chunk (state=3): >>><<< 22690 1727204243.94712: stdout chunk (state=3): >>><<< 22690 1727204243.94758: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204243.94762: handler run complete 22690 1727204243.94777: Evaluated conditional (False): False 22690 1727204243.94786: attempt loop complete, returning result 22690 1727204243.94789: _execute() done 22690 1727204243.94791: dumping result to json 22690 1727204243.94798: done dumping result, returning 22690 1727204243.94806: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [127b8e07-fff9-78bb-bf56-000000000136] 22690 1727204243.94811: sending task result for task 127b8e07-fff9-78bb-bf56-000000000136 22690 1727204243.94916: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000136 22690 1727204243.94921: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.019145", "end": "2024-09-24 14:57:23.897747", "rc": 0, "start": "2024-09-24 14:57:23.878602" } 22690 1727204243.94992: no more pending results, returning what we have 22690 1727204243.94996: results queue empty 22690 1727204243.94997: checking for any_errors_fatal 22690 1727204243.95009: done checking for any_errors_fatal 22690 1727204243.95010: checking for max_fail_percentage 22690 1727204243.95012: done checking for max_fail_percentage 22690 1727204243.95013: checking to see if all hosts have failed and the running result is not ok 22690 1727204243.95014: done checking to see if all hosts have failed 22690 1727204243.95015: getting the remaining hosts for this loop 22690 1727204243.95019: done getting the remaining hosts for this loop 22690 1727204243.95023: getting the next task for host managed-node2 22690 1727204243.95029: done getting next task for host managed-node2 22690 1727204243.95033: ^ task is: TASK: Delete veth interface {{ interface }} 22690 1727204243.95036: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204243.95039: getting variables 22690 1727204243.95041: in VariableManager get_vars() 22690 1727204243.95070: Calling all_inventory to load vars for managed-node2 22690 1727204243.95073: Calling groups_inventory to load vars for managed-node2 22690 1727204243.95076: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204243.95087: Calling all_plugins_play to load vars for managed-node2 22690 1727204243.95090: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204243.95093: Calling groups_plugins_play to load vars for managed-node2 22690 1727204243.95235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204243.95357: done with get_vars() 22690 1727204243.95369: done getting variables 22690 1727204243.95441: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204243.95562: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.387) 0:00:11.239 ***** 22690 1727204243.95623: entering _queue_task() for managed-node2/command 22690 1727204243.95908: worker is 1 (out of 1 available) 22690 1727204243.95923: exiting _queue_task() for managed-node2/command 22690 1727204243.95938: done queuing things up, now waiting for results queue to drain 22690 1727204243.95939: waiting for pending results... 
22690 1727204243.96293: running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr27 22690 1727204243.96373: in run() - task 127b8e07-fff9-78bb-bf56-000000000137 22690 1727204243.96378: variable 'ansible_search_path' from source: unknown 22690 1727204243.96380: variable 'ansible_search_path' from source: unknown 22690 1727204243.96397: calling self._execute() 22690 1727204243.96488: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.96504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.96521: variable 'omit' from source: magic vars 22690 1727204243.96920: variable 'ansible_distribution_major_version' from source: facts 22690 1727204243.96941: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204243.97170: variable 'type' from source: set_fact 22690 1727204243.97181: variable 'state' from source: include params 22690 1727204243.97191: variable 'interface' from source: set_fact 22690 1727204243.97371: variable 'current_interfaces' from source: set_fact 22690 1727204243.97375: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 22690 1727204243.97378: when evaluation is False, skipping this task 22690 1727204243.97380: _execute() done 22690 1727204243.97383: dumping result to json 22690 1727204243.97385: done dumping result, returning 22690 1727204243.97388: done running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr27 [127b8e07-fff9-78bb-bf56-000000000137] 22690 1727204243.97391: sending task result for task 127b8e07-fff9-78bb-bf56-000000000137 22690 1727204243.97464: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000137 22690 1727204243.97469: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22690 1727204243.97526: no more pending results, returning what we have 22690 1727204243.97531: results queue empty 22690 1727204243.97532: checking for any_errors_fatal 22690 1727204243.97546: done checking for any_errors_fatal 22690 1727204243.97547: checking for max_fail_percentage 22690 1727204243.97549: done checking for max_fail_percentage 22690 1727204243.97550: checking to see if all hosts have failed and the running result is not ok 22690 1727204243.97551: done checking to see if all hosts have failed 22690 1727204243.97552: getting the remaining hosts for this loop 22690 1727204243.97553: done getting the remaining hosts for this loop 22690 1727204243.97558: getting the next task for host managed-node2 22690 1727204243.97567: done getting next task for host managed-node2 22690 1727204243.97571: ^ task is: TASK: Create dummy interface {{ interface }} 22690 1727204243.97575: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204243.97579: getting variables 22690 1727204243.97581: in VariableManager get_vars() 22690 1727204243.97619: Calling all_inventory to load vars for managed-node2 22690 1727204243.97622: Calling groups_inventory to load vars for managed-node2 22690 1727204243.97626: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204243.97655: Calling all_plugins_play to load vars for managed-node2 22690 1727204243.97659: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204243.97663: Calling groups_plugins_play to load vars for managed-node2 22690 1727204243.98028: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204243.98537: done with get_vars() 22690 1727204243.98547: done getting variables 22690 1727204243.98612: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204243.98737: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.031) 0:00:11.271 ***** 22690 1727204243.98769: entering _queue_task() for managed-node2/command 22690 1727204243.99100: worker is 1 (out of 1 available) 22690 1727204243.99114: exiting _queue_task() for managed-node2/command 22690 1727204243.99130: done queuing things up, now waiting for results queue to drain 22690 1727204243.99131: waiting for pending results... 
22690 1727204243.99587: running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr27 22690 1727204243.99592: in run() - task 127b8e07-fff9-78bb-bf56-000000000138 22690 1727204243.99596: variable 'ansible_search_path' from source: unknown 22690 1727204243.99599: variable 'ansible_search_path' from source: unknown 22690 1727204243.99642: calling self._execute() 22690 1727204243.99740: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204243.99750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204243.99764: variable 'omit' from source: magic vars 22690 1727204244.00159: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.00180: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.00390: variable 'type' from source: set_fact 22690 1727204244.00394: variable 'state' from source: include params 22690 1727204244.00397: variable 'interface' from source: set_fact 22690 1727204244.00401: variable 'current_interfaces' from source: set_fact 22690 1727204244.00408: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 22690 1727204244.00411: when evaluation is False, skipping this task 22690 1727204244.00414: _execute() done 22690 1727204244.00417: dumping result to json 22690 1727204244.00422: done dumping result, returning 22690 1727204244.00428: done running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr27 [127b8e07-fff9-78bb-bf56-000000000138] 22690 1727204244.00434: sending task result for task 127b8e07-fff9-78bb-bf56-000000000138 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22690 1727204244.00692: no more pending results, returning what we have 22690 1727204244.00696: results queue empty 22690 1727204244.00697: checking for any_errors_fatal 22690 1727204244.00704: done checking for any_errors_fatal 22690 1727204244.00705: checking for max_fail_percentage 22690 1727204244.00707: done checking for max_fail_percentage 22690 1727204244.00708: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.00709: done checking to see if all hosts have failed 22690 1727204244.00710: getting the remaining hosts for this loop 22690 1727204244.00712: done getting the remaining hosts for this loop 22690 1727204244.00719: getting the next task for host managed-node2 22690 1727204244.00726: done getting next task for host managed-node2 22690 1727204244.00728: ^ task is: TASK: Delete dummy interface {{ interface }} 22690 1727204244.00731: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.00736: getting variables 22690 1727204244.00738: in VariableManager get_vars() 22690 1727204244.00872: Calling all_inventory to load vars for managed-node2 22690 1727204244.00876: Calling groups_inventory to load vars for managed-node2 22690 1727204244.00879: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.00886: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000138 22690 1727204244.00889: WORKER PROCESS EXITING 22690 1727204244.00899: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.00902: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.00905: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.01180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.01393: done with get_vars() 22690 1727204244.01406: done getting variables 22690 1727204244.01482: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204244.01599: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.028) 0:00:11.299 ***** 22690 1727204244.01634: entering _queue_task() for managed-node2/command 22690 1727204244.01938: worker is 1 (out of 1 available) 22690 1727204244.01951: exiting _queue_task() for managed-node2/command 22690 1727204244.02168: done queuing things up, now waiting for results queue to drain 22690 1727204244.02171: waiting for pending results... 
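[editor's note] The task queued here (manage_test_interface.yml:54) is the delete-side counterpart; its when clause, reported as the false_condition in the next entry, only fires when state is 'absent' and the interface is still present. A hedged sketch, with the ip command again assumed:

- name: Delete dummy interface {{ interface }}
  command: ip link delete {{ interface }}  # assumed argument
  when: type == 'dummy' and state == 'absent' and interface in current_interfaces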
22690 1727204244.02254: running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr27 22690 1727204244.02385: in run() - task 127b8e07-fff9-78bb-bf56-000000000139 22690 1727204244.02413: variable 'ansible_search_path' from source: unknown 22690 1727204244.02424: variable 'ansible_search_path' from source: unknown 22690 1727204244.02476: calling self._execute() 22690 1727204244.02577: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.02590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.02610: variable 'omit' from source: magic vars 22690 1727204244.03000: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.03022: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.03257: variable 'type' from source: set_fact 22690 1727204244.03272: variable 'state' from source: include params 22690 1727204244.03283: variable 'interface' from source: set_fact 22690 1727204244.03292: variable 'current_interfaces' from source: set_fact 22690 1727204244.03303: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 22690 1727204244.03310: when evaluation is False, skipping this task 22690 1727204244.03371: _execute() done 22690 1727204244.03375: dumping result to json 22690 1727204244.03377: done dumping result, returning 22690 1727204244.03379: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr27 [127b8e07-fff9-78bb-bf56-000000000139] 22690 1727204244.03381: sending task result for task 127b8e07-fff9-78bb-bf56-000000000139 22690 1727204244.03451: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000139 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22690 1727204244.03529: no more pending results, returning what we have 22690 1727204244.03534: results queue empty 22690 1727204244.03535: checking for any_errors_fatal 22690 1727204244.03544: done checking for any_errors_fatal 22690 1727204244.03544: checking for max_fail_percentage 22690 1727204244.03547: done checking for max_fail_percentage 22690 1727204244.03548: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.03549: done checking to see if all hosts have failed 22690 1727204244.03549: getting the remaining hosts for this loop 22690 1727204244.03551: done getting the remaining hosts for this loop 22690 1727204244.03555: getting the next task for host managed-node2 22690 1727204244.03561: done getting next task for host managed-node2 22690 1727204244.03564: ^ task is: TASK: Create tap interface {{ interface }} 22690 1727204244.03569: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.03572: getting variables 22690 1727204244.03574: in VariableManager get_vars() 22690 1727204244.03605: Calling all_inventory to load vars for managed-node2 22690 1727204244.03608: Calling groups_inventory to load vars for managed-node2 22690 1727204244.03612: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.03631: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.03634: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.03637: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.04132: WORKER PROCESS EXITING 22690 1727204244.04161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.04360: done with get_vars() 22690 1727204244.04375: done getting variables 22690 1727204244.04439: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204244.04567: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.029) 0:00:11.329 ***** 22690 1727204244.04604: entering _queue_task() for managed-node2/command 22690 1727204244.04933: worker is 1 (out of 1 available) 22690 1727204244.04947: exiting _queue_task() for managed-node2/command 22690 1727204244.04960: done queuing things up, now waiting for results queue to drain 22690 1727204244.04961: waiting for pending results... 
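[editor's note] The tap task queued here (manage_test_interface.yml:60) and the tap-delete task queued right after it (:65) mirror the dummy pair above. A sketch of both, with the conditionals taken from the false_condition strings in the two skip results that follow and the ip tuntap commands assumed:

- name: Create tap interface {{ interface }}
  command: ip tuntap add dev {{ interface }} mode tap  # assumed argument
  when: type == 'tap' and state == 'present' and interface not in current_interfaces

- name: Delete tap interface {{ interface }}
  command: ip tuntap del dev {{ interface }} mode tap  # assumed argument
  when: type == 'tap' and state == 'absent' and interface in current_interfaces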
22690 1727204244.05252: running TaskExecutor() for managed-node2/TASK: Create tap interface lsr27 22690 1727204244.05379: in run() - task 127b8e07-fff9-78bb-bf56-00000000013a 22690 1727204244.05405: variable 'ansible_search_path' from source: unknown 22690 1727204244.05413: variable 'ansible_search_path' from source: unknown 22690 1727204244.05462: calling self._execute() 22690 1727204244.05560: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.05577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.05593: variable 'omit' from source: magic vars 22690 1727204244.05999: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.06023: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.06257: variable 'type' from source: set_fact 22690 1727204244.06270: variable 'state' from source: include params 22690 1727204244.06280: variable 'interface' from source: set_fact 22690 1727204244.06288: variable 'current_interfaces' from source: set_fact 22690 1727204244.06299: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 22690 1727204244.06306: when evaluation is False, skipping this task 22690 1727204244.06313: _execute() done 22690 1727204244.06323: dumping result to json 22690 1727204244.06334: done dumping result, returning 22690 1727204244.06344: done running TaskExecutor() for managed-node2/TASK: Create tap interface lsr27 [127b8e07-fff9-78bb-bf56-00000000013a] 22690 1727204244.06354: sending task result for task 127b8e07-fff9-78bb-bf56-00000000013a 22690 1727204244.06697: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000013a 22690 1727204244.06700: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22690 1727204244.06742: no more pending results, returning what we have 22690 1727204244.06745: results queue empty 22690 1727204244.06746: checking for any_errors_fatal 22690 1727204244.06751: done checking for any_errors_fatal 22690 1727204244.06751: checking for max_fail_percentage 22690 1727204244.06753: done checking for max_fail_percentage 22690 1727204244.06754: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.06755: done checking to see if all hosts have failed 22690 1727204244.06756: getting the remaining hosts for this loop 22690 1727204244.06757: done getting the remaining hosts for this loop 22690 1727204244.06761: getting the next task for host managed-node2 22690 1727204244.06768: done getting next task for host managed-node2 22690 1727204244.06771: ^ task is: TASK: Delete tap interface {{ interface }} 22690 1727204244.06774: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.06778: getting variables 22690 1727204244.06779: in VariableManager get_vars() 22690 1727204244.06807: Calling all_inventory to load vars for managed-node2 22690 1727204244.06810: Calling groups_inventory to load vars for managed-node2 22690 1727204244.06813: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.06827: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.06830: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.06833: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.07024: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.07295: done with get_vars() 22690 1727204244.07307: done getting variables 22690 1727204244.07374: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204244.07497: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.029) 0:00:11.358 ***** 22690 1727204244.07530: entering _queue_task() for managed-node2/command 22690 1727204244.07842: worker is 1 (out of 1 available) 22690 1727204244.07856: exiting _queue_task() for managed-node2/command 22690 1727204244.07973: done queuing things up, now waiting for results queue to drain 22690 1727204244.07975: waiting for pending results... 
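[editor's note] Once the tap-delete below is skipped, the play includes assert_device_present.yml, which in turn includes get_interface_stat.yml and runs a stat task against /sys/class/net/lsr27. A sketch of that stat task, reconstructed from the module_args reported further down (get_attributes, get_checksum and get_mime all false, path /sys/class/net/{{ interface }}); the register name is an assumption:

- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat  # assumed register name; the module arguments match the invocation logged below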
22690 1727204244.08160: running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr27 22690 1727204244.08287: in run() - task 127b8e07-fff9-78bb-bf56-00000000013b 22690 1727204244.08312: variable 'ansible_search_path' from source: unknown 22690 1727204244.08322: variable 'ansible_search_path' from source: unknown 22690 1727204244.08370: calling self._execute() 22690 1727204244.08462: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.08477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.08494: variable 'omit' from source: magic vars 22690 1727204244.08888: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.08908: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.09145: variable 'type' from source: set_fact 22690 1727204244.09171: variable 'state' from source: include params 22690 1727204244.09175: variable 'interface' from source: set_fact 22690 1727204244.09177: variable 'current_interfaces' from source: set_fact 22690 1727204244.09225: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 22690 1727204244.09229: when evaluation is False, skipping this task 22690 1727204244.09232: _execute() done 22690 1727204244.09234: dumping result to json 22690 1727204244.09237: done dumping result, returning 22690 1727204244.09239: done running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr27 [127b8e07-fff9-78bb-bf56-00000000013b] 22690 1727204244.09241: sending task result for task 127b8e07-fff9-78bb-bf56-00000000013b 22690 1727204244.09588: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000013b 22690 1727204244.09591: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22690 1727204244.09632: no more pending results, returning what we have 22690 1727204244.09635: results queue empty 22690 1727204244.09636: checking for any_errors_fatal 22690 1727204244.09641: done checking for any_errors_fatal 22690 1727204244.09642: checking for max_fail_percentage 22690 1727204244.09643: done checking for max_fail_percentage 22690 1727204244.09644: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.09645: done checking to see if all hosts have failed 22690 1727204244.09646: getting the remaining hosts for this loop 22690 1727204244.09647: done getting the remaining hosts for this loop 22690 1727204244.09650: getting the next task for host managed-node2 22690 1727204244.09657: done getting next task for host managed-node2 22690 1727204244.09661: ^ task is: TASK: Include the task 'assert_device_present.yml' 22690 1727204244.09663: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.09668: getting variables 22690 1727204244.09670: in VariableManager get_vars() 22690 1727204244.09699: Calling all_inventory to load vars for managed-node2 22690 1727204244.09701: Calling groups_inventory to load vars for managed-node2 22690 1727204244.09705: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.09719: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.09722: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.09726: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.09959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.10221: done with get_vars() 22690 1727204244.10232: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.028) 0:00:11.386 ***** 22690 1727204244.10335: entering _queue_task() for managed-node2/include_tasks 22690 1727204244.10649: worker is 1 (out of 1 available) 22690 1727204244.10664: exiting _queue_task() for managed-node2/include_tasks 22690 1727204244.10680: done queuing things up, now waiting for results queue to drain 22690 1727204244.10681: waiting for pending results... 22690 1727204244.10974: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 22690 1727204244.11095: in run() - task 127b8e07-fff9-78bb-bf56-000000000012 22690 1727204244.11119: variable 'ansible_search_path' from source: unknown 22690 1727204244.11164: calling self._execute() 22690 1727204244.11270: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.11285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.11300: variable 'omit' from source: magic vars 22690 1727204244.11735: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.11760: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.11779: _execute() done 22690 1727204244.11789: dumping result to json 22690 1727204244.11797: done dumping result, returning 22690 1727204244.11808: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [127b8e07-fff9-78bb-bf56-000000000012] 22690 1727204244.11858: sending task result for task 127b8e07-fff9-78bb-bf56-000000000012 22690 1727204244.12100: no more pending results, returning what we have 22690 1727204244.12107: in VariableManager get_vars() 22690 1727204244.12153: Calling all_inventory to load vars for managed-node2 22690 1727204244.12157: Calling groups_inventory to load vars for managed-node2 22690 1727204244.12161: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.12183: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.12187: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.12274: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.12792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.13000: done with get_vars() 22690 1727204244.13009: variable 'ansible_search_path' from source: unknown 22690 1727204244.13274: done sending task result for task 
127b8e07-fff9-78bb-bf56-000000000012 22690 1727204244.13278: WORKER PROCESS EXITING 22690 1727204244.13286: we have included files to process 22690 1727204244.13287: generating all_blocks data 22690 1727204244.13290: done generating all_blocks data 22690 1727204244.13295: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22690 1727204244.13296: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22690 1727204244.13299: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22690 1727204244.13591: in VariableManager get_vars() 22690 1727204244.13614: done with get_vars() 22690 1727204244.13997: done processing included file 22690 1727204244.14000: iterating over new_blocks loaded from include file 22690 1727204244.14001: in VariableManager get_vars() 22690 1727204244.14015: done with get_vars() 22690 1727204244.14019: filtering new block on tags 22690 1727204244.14039: done filtering new block on tags 22690 1727204244.14042: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 22690 1727204244.14048: extending task lists for all hosts with included blocks 22690 1727204244.14985: done extending task lists 22690 1727204244.14987: done processing included files 22690 1727204244.14987: results queue empty 22690 1727204244.14988: checking for any_errors_fatal 22690 1727204244.14992: done checking for any_errors_fatal 22690 1727204244.14993: checking for max_fail_percentage 22690 1727204244.14994: done checking for max_fail_percentage 22690 1727204244.14995: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.14996: done checking to see if all hosts have failed 22690 1727204244.14997: getting the remaining hosts for this loop 22690 1727204244.14998: done getting the remaining hosts for this loop 22690 1727204244.15001: getting the next task for host managed-node2 22690 1727204244.15005: done getting next task for host managed-node2 22690 1727204244.15007: ^ task is: TASK: Include the task 'get_interface_stat.yml' 22690 1727204244.15010: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.15013: getting variables 22690 1727204244.15014: in VariableManager get_vars() 22690 1727204244.15029: Calling all_inventory to load vars for managed-node2 22690 1727204244.15032: Calling groups_inventory to load vars for managed-node2 22690 1727204244.15035: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.15042: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.15045: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.15048: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.15207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.15409: done with get_vars() 22690 1727204244.15424: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.051) 0:00:11.438 ***** 22690 1727204244.15512: entering _queue_task() for managed-node2/include_tasks 22690 1727204244.15972: worker is 1 (out of 1 available) 22690 1727204244.15985: exiting _queue_task() for managed-node2/include_tasks 22690 1727204244.15998: done queuing things up, now waiting for results queue to drain 22690 1727204244.16000: waiting for pending results... 22690 1727204244.16192: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 22690 1727204244.16314: in run() - task 127b8e07-fff9-78bb-bf56-0000000001d3 22690 1727204244.16342: variable 'ansible_search_path' from source: unknown 22690 1727204244.16350: variable 'ansible_search_path' from source: unknown 22690 1727204244.16397: calling self._execute() 22690 1727204244.16507: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.16522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.16539: variable 'omit' from source: magic vars 22690 1727204244.17312: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.17342: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.17355: _execute() done 22690 1727204244.17364: dumping result to json 22690 1727204244.17374: done dumping result, returning 22690 1727204244.17388: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-78bb-bf56-0000000001d3] 22690 1727204244.17399: sending task result for task 127b8e07-fff9-78bb-bf56-0000000001d3 22690 1727204244.17571: no more pending results, returning what we have 22690 1727204244.17577: in VariableManager get_vars() 22690 1727204244.17622: Calling all_inventory to load vars for managed-node2 22690 1727204244.17626: Calling groups_inventory to load vars for managed-node2 22690 1727204244.17631: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.17649: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.17653: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.17656: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.18168: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000001d3 22690 1727204244.18173: WORKER PROCESS EXITING 22690 1727204244.18199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 22690 1727204244.18484: done with get_vars() 22690 1727204244.18493: variable 'ansible_search_path' from source: unknown 22690 1727204244.18495: variable 'ansible_search_path' from source: unknown 22690 1727204244.18574: we have included files to process 22690 1727204244.18576: generating all_blocks data 22690 1727204244.18578: done generating all_blocks data 22690 1727204244.18579: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22690 1727204244.18580: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22690 1727204244.18583: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22690 1727204244.18900: done processing included file 22690 1727204244.18903: iterating over new_blocks loaded from include file 22690 1727204244.18905: in VariableManager get_vars() 22690 1727204244.18921: done with get_vars() 22690 1727204244.18923: filtering new block on tags 22690 1727204244.18939: done filtering new block on tags 22690 1727204244.18941: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 22690 1727204244.18946: extending task lists for all hosts with included blocks 22690 1727204244.19060: done extending task lists 22690 1727204244.19061: done processing included files 22690 1727204244.19062: results queue empty 22690 1727204244.19063: checking for any_errors_fatal 22690 1727204244.19069: done checking for any_errors_fatal 22690 1727204244.19070: checking for max_fail_percentage 22690 1727204244.19071: done checking for max_fail_percentage 22690 1727204244.19072: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.19073: done checking to see if all hosts have failed 22690 1727204244.19074: getting the remaining hosts for this loop 22690 1727204244.19075: done getting the remaining hosts for this loop 22690 1727204244.19077: getting the next task for host managed-node2 22690 1727204244.19081: done getting next task for host managed-node2 22690 1727204244.19083: ^ task is: TASK: Get stat for interface {{ interface }} 22690 1727204244.19087: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.19089: getting variables 22690 1727204244.19090: in VariableManager get_vars() 22690 1727204244.19099: Calling all_inventory to load vars for managed-node2 22690 1727204244.19105: Calling groups_inventory to load vars for managed-node2 22690 1727204244.19108: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.19114: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.19119: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.19122: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.19289: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.19528: done with get_vars() 22690 1727204244.19542: done getting variables 22690 1727204244.19723: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.042) 0:00:11.480 ***** 22690 1727204244.19760: entering _queue_task() for managed-node2/stat 22690 1727204244.20209: worker is 1 (out of 1 available) 22690 1727204244.20228: exiting _queue_task() for managed-node2/stat 22690 1727204244.20241: done queuing things up, now waiting for results queue to drain 22690 1727204244.20242: waiting for pending results... 22690 1727204244.20468: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 22690 1727204244.20598: in run() - task 127b8e07-fff9-78bb-bf56-00000000021e 22690 1727204244.20630: variable 'ansible_search_path' from source: unknown 22690 1727204244.20640: variable 'ansible_search_path' from source: unknown 22690 1727204244.20688: calling self._execute() 22690 1727204244.20790: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.20805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.20822: variable 'omit' from source: magic vars 22690 1727204244.21612: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.21619: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.21622: variable 'omit' from source: magic vars 22690 1727204244.21755: variable 'omit' from source: magic vars 22690 1727204244.21994: variable 'interface' from source: set_fact 22690 1727204244.22252: variable 'omit' from source: magic vars 22690 1727204244.22256: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204244.22291: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204244.22321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204244.22357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204244.22378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204244.22472: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204244.22476: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.22479: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 22690 1727204244.22549: Set connection var ansible_connection to ssh 22690 1727204244.22567: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204244.22584: Set connection var ansible_pipelining to False 22690 1727204244.22591: Set connection var ansible_shell_type to sh 22690 1727204244.22604: Set connection var ansible_shell_executable to /bin/sh 22690 1727204244.22620: Set connection var ansible_timeout to 10 22690 1727204244.22649: variable 'ansible_shell_executable' from source: unknown 22690 1727204244.22657: variable 'ansible_connection' from source: unknown 22690 1727204244.22664: variable 'ansible_module_compression' from source: unknown 22690 1727204244.22674: variable 'ansible_shell_type' from source: unknown 22690 1727204244.22923: variable 'ansible_shell_executable' from source: unknown 22690 1727204244.22926: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.22929: variable 'ansible_pipelining' from source: unknown 22690 1727204244.22932: variable 'ansible_timeout' from source: unknown 22690 1727204244.22934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.23139: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204244.23159: variable 'omit' from source: magic vars 22690 1727204244.23174: starting attempt loop 22690 1727204244.23181: running the handler 22690 1727204244.23199: _low_level_execute_command(): starting 22690 1727204244.23214: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204244.24313: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204244.24332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204244.24347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.24393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204244.24413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204244.24426: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204244.24440: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.24463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204244.24480: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204244.24570: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.24608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.24632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.24723: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.26756: stdout chunk (state=3): >>>/root <<< 22690 1727204244.26761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.26928: stderr chunk (state=3): >>><<< 22690 1727204244.26932: stdout chunk (state=3): >>><<< 22690 1727204244.26955: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.27112: _low_level_execute_command(): starting 22690 1727204244.27116: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594 `" && echo ansible-tmp-1727204244.270072-23433-153041518809594="` echo /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594 `" ) && sleep 0' 22690 1727204244.28343: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.28384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.28397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.28416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.28524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.30539: stdout chunk (state=3): 
>>>ansible-tmp-1727204244.270072-23433-153041518809594=/root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594 <<< 22690 1727204244.30873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.30877: stdout chunk (state=3): >>><<< 22690 1727204244.30880: stderr chunk (state=3): >>><<< 22690 1727204244.30882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204244.270072-23433-153041518809594=/root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.30885: variable 'ansible_module_compression' from source: unknown 22690 1727204244.30895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22690 1727204244.30940: variable 'ansible_facts' from source: unknown 22690 1727204244.31044: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py 22690 1727204244.31244: Sending initial data 22690 1727204244.31247: Sent initial data (152 bytes) 22690 1727204244.31927: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204244.31990: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.32056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.32077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.32107: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 22690 1727204244.32318: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.33851: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22690 1727204244.33862: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 22690 1727204244.33877: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 22690 1727204244.33885: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 22690 1727204244.33892: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 22690 1727204244.33925: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204244.33996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204244.34070: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp5lez36qm /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py <<< 22690 1727204244.34073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py" <<< 22690 1727204244.34132: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp5lez36qm" to remote "/root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py" <<< 22690 1727204244.35051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.35348: stderr chunk (state=3): >>><<< 22690 1727204244.35352: stdout chunk (state=3): >>><<< 22690 1727204244.35355: done transferring module to remote 22690 1727204244.35357: _low_level_execute_command(): starting 22690 1727204244.35360: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/ /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py && sleep 0' 22690 1727204244.35880: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204244.35889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204244.35901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.35921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204244.35971: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204244.35975: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204244.35977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.35980: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204244.36054: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.36078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.36187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.38090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.38153: stderr chunk (state=3): >>><<< 22690 1727204244.38161: stdout chunk (state=3): >>><<< 22690 1727204244.38270: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.38276: _low_level_execute_command(): starting 22690 1727204244.38279: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/AnsiballZ_stat.py && sleep 0' 22690 1727204244.38888: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.38940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.38955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.38976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.39086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.55779: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35797, "dev": 23, "nlink": 1, "atime": 1727204242.648541, "mtime": 1727204242.648541, "ctime": 1727204242.648541, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22690 1727204244.57148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204244.57202: stderr chunk (state=3): >>><<< 22690 1727204244.57205: stdout chunk (state=3): >>><<< 22690 1727204244.57224: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35797, "dev": 23, "nlink": 1, "atime": 1727204242.648541, "mtime": 1727204242.648541, "ctime": 1727204242.648541, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204244.57270: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204244.57279: _low_level_execute_command(): starting 22690 1727204244.57285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204244.270072-23433-153041518809594/ > /dev/null 2>&1 && sleep 0' 22690 1727204244.57774: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.57778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.57785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204244.57787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.57842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.57846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.57850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.57923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.60144: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.60150: stdout chunk (state=3): >>><<< 22690 1727204244.60153: stderr chunk (state=3): >>><<< 22690 1727204244.60392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.60395: handler run complete 22690 1727204244.60397: attempt loop complete, returning result 22690 1727204244.60400: _execute() done 22690 1727204244.60402: dumping result to json 22690 1727204244.60404: done dumping result, returning 22690 1727204244.60407: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 [127b8e07-fff9-78bb-bf56-00000000021e] 22690 1727204244.60409: sending task result for task 127b8e07-fff9-78bb-bf56-00000000021e 22690 1727204244.60502: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000021e 22690 1727204244.60505: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204242.648541, "block_size": 4096, "blocks": 0, "ctime": 1727204242.648541, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35797, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1727204242.648541, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 22690 1727204244.60781: no more pending results, returning what we have 22690 1727204244.60786: results queue empty 22690 1727204244.60787: checking for any_errors_fatal 22690 1727204244.60790: done checking for any_errors_fatal 22690 1727204244.60790: checking for max_fail_percentage 22690 1727204244.60792: done checking for max_fail_percentage 22690 1727204244.60793: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.60794: done checking to see if all hosts have failed 22690 1727204244.60795: getting the remaining hosts for this loop 22690 1727204244.60797: done getting the remaining hosts for this loop 22690 1727204244.60802: getting the next task for host managed-node2 22690 1727204244.60811: done getting next task for host managed-node2 22690 1727204244.60815: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 22690 1727204244.60821: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204244.60826: getting variables 22690 1727204244.60828: in VariableManager get_vars() 22690 1727204244.60862: Calling all_inventory to load vars for managed-node2 22690 1727204244.60865: Calling groups_inventory to load vars for managed-node2 22690 1727204244.61120: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.61136: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.61140: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.61144: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.61796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.62326: done with get_vars() 22690 1727204244.62344: done getting variables 22690 1727204244.62533: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 22690 1727204244.62695: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.429) 0:00:11.910 ***** 22690 1727204244.62735: entering _queue_task() for managed-node2/assert 22690 1727204244.62737: Creating lock for assert 22690 1727204244.63139: worker is 1 (out of 1 available) 22690 1727204244.63155: exiting _queue_task() for managed-node2/assert 22690 1727204244.63172: done queuing things up, now waiting for results queue to drain 22690 1727204244.63174: waiting for pending results... 
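
The stat-and-assert pair traced here follows the usual pattern for checking that a kernel network interface exists. Below is a minimal sketch of such a task pair, reconstructed from the stat module_args and the interface_stat.stat.exists conditional visible in this trace; the task names, the interface_stat register name, and the exact layout are assumptions, and the real tasks/assert_device_present.yml in the collection may differ.

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false   # mirrors the module_args recorded in the result above
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists

For a virtual device such as lsr27, /sys/class/net/lsr27 is a symlink into /sys/devices/virtual/net/, which is why the stat result above reports islnk true, mode 0777, and lnk_target ../../devices/virtual/net/lsr27.
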
22690 1727204244.63507: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr27' 22690 1727204244.63521: in run() - task 127b8e07-fff9-78bb-bf56-0000000001d4 22690 1727204244.63530: variable 'ansible_search_path' from source: unknown 22690 1727204244.63533: variable 'ansible_search_path' from source: unknown 22690 1727204244.63576: calling self._execute() 22690 1727204244.63741: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.63745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.63750: variable 'omit' from source: magic vars 22690 1727204244.64096: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.64107: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.64117: variable 'omit' from source: magic vars 22690 1727204244.64150: variable 'omit' from source: magic vars 22690 1727204244.64233: variable 'interface' from source: set_fact 22690 1727204244.64247: variable 'omit' from source: magic vars 22690 1727204244.64284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204244.64320: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204244.64340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204244.64355: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204244.64368: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204244.64471: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204244.64474: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.64476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.64572: Set connection var ansible_connection to ssh 22690 1727204244.64594: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204244.64607: Set connection var ansible_pipelining to False 22690 1727204244.64670: Set connection var ansible_shell_type to sh 22690 1727204244.64674: Set connection var ansible_shell_executable to /bin/sh 22690 1727204244.64677: Set connection var ansible_timeout to 10 22690 1727204244.64679: variable 'ansible_shell_executable' from source: unknown 22690 1727204244.64683: variable 'ansible_connection' from source: unknown 22690 1727204244.64687: variable 'ansible_module_compression' from source: unknown 22690 1727204244.64689: variable 'ansible_shell_type' from source: unknown 22690 1727204244.64691: variable 'ansible_shell_executable' from source: unknown 22690 1727204244.64693: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.64706: variable 'ansible_pipelining' from source: unknown 22690 1727204244.64713: variable 'ansible_timeout' from source: unknown 22690 1727204244.64724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.64890: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 22690 1727204244.64920: variable 'omit' from source: magic vars 22690 1727204244.64923: starting attempt loop 22690 1727204244.64972: running the handler 22690 1727204244.65091: variable 'interface_stat' from source: set_fact 22690 1727204244.65119: Evaluated conditional (interface_stat.stat.exists): True 22690 1727204244.65135: handler run complete 22690 1727204244.65154: attempt loop complete, returning result 22690 1727204244.65160: _execute() done 22690 1727204244.65171: dumping result to json 22690 1727204244.65271: done dumping result, returning 22690 1727204244.65275: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr27' [127b8e07-fff9-78bb-bf56-0000000001d4] 22690 1727204244.65278: sending task result for task 127b8e07-fff9-78bb-bf56-0000000001d4 22690 1727204244.65360: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000001d4 22690 1727204244.65364: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22690 1727204244.65422: no more pending results, returning what we have 22690 1727204244.65426: results queue empty 22690 1727204244.65427: checking for any_errors_fatal 22690 1727204244.65437: done checking for any_errors_fatal 22690 1727204244.65438: checking for max_fail_percentage 22690 1727204244.65439: done checking for max_fail_percentage 22690 1727204244.65440: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.65441: done checking to see if all hosts have failed 22690 1727204244.65442: getting the remaining hosts for this loop 22690 1727204244.65444: done getting the remaining hosts for this loop 22690 1727204244.65448: getting the next task for host managed-node2 22690 1727204244.65457: done getting next task for host managed-node2 22690 1727204244.65460: ^ task is: TASK: meta (flush_handlers) 22690 1727204244.65462: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.65468: getting variables 22690 1727204244.65469: in VariableManager get_vars() 22690 1727204244.65500: Calling all_inventory to load vars for managed-node2 22690 1727204244.65503: Calling groups_inventory to load vars for managed-node2 22690 1727204244.65507: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.65519: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.65522: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.65525: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.65834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.65953: done with get_vars() 22690 1727204244.65961: done getting variables 22690 1727204244.66015: in VariableManager get_vars() 22690 1727204244.66026: Calling all_inventory to load vars for managed-node2 22690 1727204244.66028: Calling groups_inventory to load vars for managed-node2 22690 1727204244.66029: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.66033: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.66035: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.66037: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.66136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.66248: done with get_vars() 22690 1727204244.66258: done queuing things up, now waiting for results queue to drain 22690 1727204244.66260: results queue empty 22690 1727204244.66260: checking for any_errors_fatal 22690 1727204244.66262: done checking for any_errors_fatal 22690 1727204244.66263: checking for max_fail_percentage 22690 1727204244.66263: done checking for max_fail_percentage 22690 1727204244.66264: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.66264: done checking to see if all hosts have failed 22690 1727204244.66271: getting the remaining hosts for this loop 22690 1727204244.66272: done getting the remaining hosts for this loop 22690 1727204244.66274: getting the next task for host managed-node2 22690 1727204244.66277: done getting next task for host managed-node2 22690 1727204244.66278: ^ task is: TASK: meta (flush_handlers) 22690 1727204244.66279: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.66281: getting variables 22690 1727204244.66281: in VariableManager get_vars() 22690 1727204244.66287: Calling all_inventory to load vars for managed-node2 22690 1727204244.66288: Calling groups_inventory to load vars for managed-node2 22690 1727204244.66290: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.66293: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.66295: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.66297: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.66401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.66512: done with get_vars() 22690 1727204244.66519: done getting variables 22690 1727204244.66558: in VariableManager get_vars() 22690 1727204244.66566: Calling all_inventory to load vars for managed-node2 22690 1727204244.66568: Calling groups_inventory to load vars for managed-node2 22690 1727204244.66570: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.66574: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.66575: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.66577: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.66660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.66775: done with get_vars() 22690 1727204244.66783: done queuing things up, now waiting for results queue to drain 22690 1727204244.66785: results queue empty 22690 1727204244.66785: checking for any_errors_fatal 22690 1727204244.66786: done checking for any_errors_fatal 22690 1727204244.66787: checking for max_fail_percentage 22690 1727204244.66787: done checking for max_fail_percentage 22690 1727204244.66788: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.66788: done checking to see if all hosts have failed 22690 1727204244.66789: getting the remaining hosts for this loop 22690 1727204244.66790: done getting the remaining hosts for this loop 22690 1727204244.66791: getting the next task for host managed-node2 22690 1727204244.66793: done getting next task for host managed-node2 22690 1727204244.66794: ^ task is: None 22690 1727204244.66795: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.66796: done queuing things up, now waiting for results queue to drain 22690 1727204244.66797: results queue empty 22690 1727204244.66797: checking for any_errors_fatal 22690 1727204244.66797: done checking for any_errors_fatal 22690 1727204244.66798: checking for max_fail_percentage 22690 1727204244.66799: done checking for max_fail_percentage 22690 1727204244.66800: checking to see if all hosts have failed and the running result is not ok 22690 1727204244.66800: done checking to see if all hosts have failed 22690 1727204244.66801: getting the next task for host managed-node2 22690 1727204244.66803: done getting next task for host managed-node2 22690 1727204244.66803: ^ task is: None 22690 1727204244.66804: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204244.66844: in VariableManager get_vars() 22690 1727204244.66860: done with get_vars() 22690 1727204244.66864: in VariableManager get_vars() 22690 1727204244.66876: done with get_vars() 22690 1727204244.66880: variable 'omit' from source: magic vars 22690 1727204244.66902: in VariableManager get_vars() 22690 1727204244.66911: done with get_vars() 22690 1727204244.66929: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 22690 1727204244.67373: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204244.67394: getting the remaining hosts for this loop 22690 1727204244.67395: done getting the remaining hosts for this loop 22690 1727204244.67397: getting the next task for host managed-node2 22690 1727204244.67399: done getting next task for host managed-node2 22690 1727204244.67400: ^ task is: TASK: Gathering Facts 22690 1727204244.67401: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204244.67403: getting variables 22690 1727204244.67403: in VariableManager get_vars() 22690 1727204244.67411: Calling all_inventory to load vars for managed-node2 22690 1727204244.67412: Calling groups_inventory to load vars for managed-node2 22690 1727204244.67413: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204244.67419: Calling all_plugins_play to load vars for managed-node2 22690 1727204244.67422: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204244.67424: Calling groups_plugins_play to load vars for managed-node2 22690 1727204244.67508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204244.67623: done with get_vars() 22690 1727204244.67629: done getting variables 22690 1727204244.67662: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Tuesday 24 September 2024 14:57:24 -0400 (0:00:00.049) 0:00:11.960 ***** 22690 1727204244.67682: entering _queue_task() for managed-node2/gather_facts 22690 1727204244.67918: worker is 1 (out of 1 available) 22690 1727204244.67931: exiting _queue_task() for managed-node2/gather_facts 22690 1727204244.67945: done queuing things up, now waiting for results queue to drain 22690 1727204244.67946: waiting for pending results... 
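
The Gathering Facts task queued here is the implicit fact gathering for the new play; it runs the setup module on managed-node2 over the same multiplexed SSH connection used above. An equivalent explicit task, reconstructed from the module_args recorded in the invocation block of the facts output further down, would look roughly like the following sketch (the play itself relies on implicit gather_facts rather than an explicit setup call):

- name: Gathering Facts
  ansible.builtin.setup:
    gather_subset:
      - all
    gather_timeout: 10
    fact_path: /etc/ansible/facts.d

The AnsiballZ_setup.py file transferred in the entries that follow is the self-contained payload Ansible builds around the setup module; it is copied to the per-task tmpdir over SFTP, made executable with chmod u+x, run with /usr/bin/python3.12, and the JSON it prints on stdout becomes the ansible_facts shown below.
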
22690 1727204244.68121: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204244.68183: in run() - task 127b8e07-fff9-78bb-bf56-000000000237 22690 1727204244.68196: variable 'ansible_search_path' from source: unknown 22690 1727204244.68229: calling self._execute() 22690 1727204244.68295: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.68299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.68309: variable 'omit' from source: magic vars 22690 1727204244.68591: variable 'ansible_distribution_major_version' from source: facts 22690 1727204244.68604: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204244.68607: variable 'omit' from source: magic vars 22690 1727204244.68630: variable 'omit' from source: magic vars 22690 1727204244.68659: variable 'omit' from source: magic vars 22690 1727204244.68694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204244.68725: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204244.68743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204244.68759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204244.68771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204244.68795: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204244.68799: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.68803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.68878: Set connection var ansible_connection to ssh 22690 1727204244.68887: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204244.68894: Set connection var ansible_pipelining to False 22690 1727204244.68897: Set connection var ansible_shell_type to sh 22690 1727204244.68902: Set connection var ansible_shell_executable to /bin/sh 22690 1727204244.68909: Set connection var ansible_timeout to 10 22690 1727204244.68927: variable 'ansible_shell_executable' from source: unknown 22690 1727204244.68930: variable 'ansible_connection' from source: unknown 22690 1727204244.68933: variable 'ansible_module_compression' from source: unknown 22690 1727204244.68937: variable 'ansible_shell_type' from source: unknown 22690 1727204244.68940: variable 'ansible_shell_executable' from source: unknown 22690 1727204244.68942: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204244.68945: variable 'ansible_pipelining' from source: unknown 22690 1727204244.68947: variable 'ansible_timeout' from source: unknown 22690 1727204244.68952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204244.69126: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204244.69134: variable 'omit' from source: magic vars 22690 1727204244.69138: starting attempt loop 22690 1727204244.69141: running the 
handler 22690 1727204244.69154: variable 'ansible_facts' from source: unknown 22690 1727204244.69173: _low_level_execute_command(): starting 22690 1727204244.69182: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204244.69744: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.69748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204244.69753: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204244.69756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.69807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.69810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.69812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.69895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.71651: stdout chunk (state=3): >>>/root <<< 22690 1727204244.71756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.71825: stderr chunk (state=3): >>><<< 22690 1727204244.71828: stdout chunk (state=3): >>><<< 22690 1727204244.71853: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.71867: _low_level_execute_command(): starting 22690 1727204244.71874: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632 `" && echo ansible-tmp-1727204244.7185214-23463-53061716275632="` echo /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632 `" ) && sleep 0' 22690 1727204244.72380: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.72384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.72396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204244.72399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.72442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204244.72446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.72527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.74509: stdout chunk (state=3): >>>ansible-tmp-1727204244.7185214-23463-53061716275632=/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632 <<< 22690 1727204244.74640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.74674: stderr chunk (state=3): >>><<< 22690 1727204244.74678: stdout chunk (state=3): >>><<< 22690 1727204244.74694: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204244.7185214-23463-53061716275632=/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.74722: variable 
'ansible_module_compression' from source: unknown 22690 1727204244.74767: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204244.74821: variable 'ansible_facts' from source: unknown 22690 1727204244.74962: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py 22690 1727204244.75089: Sending initial data 22690 1727204244.75092: Sent initial data (153 bytes) 22690 1727204244.75597: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.75604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.75607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.75609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.75665: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.75673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.75742: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.77355: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204244.77422: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204244.77495: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwfx8_eg8 /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py <<< 22690 1727204244.77499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py" <<< 22690 1727204244.77561: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwfx8_eg8" to remote "/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py" <<< 22690 1727204244.77568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py" <<< 22690 1727204244.78770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.78849: stderr chunk (state=3): >>><<< 22690 1727204244.78854: stdout chunk (state=3): >>><<< 22690 1727204244.78879: done transferring module to remote 22690 1727204244.78890: _low_level_execute_command(): starting 22690 1727204244.78895: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/ /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py && sleep 0' 22690 1727204244.79398: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204244.79401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.79404: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204244.79407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204244.79416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.79472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.79475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.79542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204244.81411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204244.81469: stderr chunk (state=3): >>><<< 22690 1727204244.81473: stdout chunk (state=3): >>><<< 22690 1727204244.81490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204244.81494: _low_level_execute_command(): starting 22690 1727204244.81497: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/AnsiballZ_setup.py && sleep 0' 22690 1727204244.82008: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.82012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.82015: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204244.82017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204244.82077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204244.82081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204244.82159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204245.49907: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.640625, "5m": 0.53173828125, "15m": 0.28515625}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3014, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 702, "free": 3014}, "nocache": {"free": 3446, "used": 270}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 591, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316695040, "block_size": 4096, "block_total": 64479564, "block_available": 61356615, "block_used": 3122949, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": 
"10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "25", "epoch": "1727204245", "epoch_int": "1727204245", "date": "2024-09-24", "time": "14:57:25", "iso8601_micro": "2024-09-24T18:57:25.494770Z", "iso8601": "2024-09-24T18:57:25Z", "iso8601_basic": "20240924T145725494770", "iso8601_basic_short": "20240924T145725", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204245.51910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204245.52020: stderr chunk (state=3): >>><<< 22690 1727204245.52024: stdout chunk (state=3): >>><<< 22690 1727204245.52173: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.640625, "5m": 0.53173828125, "15m": 0.28515625}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", 
"PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3014, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 702, "free": 3014}, "nocache": {"free": 3446, "used": 270}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 591, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316695040, "block_size": 4096, "block_total": 64479564, "block_available": 61356615, "block_used": 3122949, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_interfaces": ["lo", "peerlsr27", "lsr27", "eth0"], 
"ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_service_mgr": "systemd", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "25", "epoch": "1727204245", "epoch_int": "1727204245", "date": "2024-09-24", "time": "14:57:25", "iso8601_micro": "2024-09-24T18:57:25.494770Z", "iso8601": "2024-09-24T18:57:25Z", "iso8601_basic": "20240924T145725494770", "iso8601_basic_short": "20240924T145725", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204245.52427: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204245.52458: _low_level_execute_command(): starting 22690 1727204245.52472: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204244.7185214-23463-53061716275632/ > /dev/null 2>&1 && sleep 0' 22690 1727204245.53277: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204245.53296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204245.53311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204245.53395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204245.53434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204245.53451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204245.53477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 22690 1727204245.53589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204245.55597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204245.55624: stderr chunk (state=3): >>><<< 22690 1727204245.55627: stdout chunk (state=3): >>><<< 22690 1727204245.55644: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204245.55661: handler run complete 22690 1727204245.55814: variable 'ansible_facts' from source: unknown 22690 1727204245.56070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.56277: variable 'ansible_facts' from source: unknown 22690 1727204245.56373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.56513: attempt loop complete, returning result 22690 1727204245.56524: _execute() done 22690 1727204245.56533: dumping result to json 22690 1727204245.56570: done dumping result, returning 22690 1727204245.56584: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-000000000237] 22690 1727204245.56593: sending task result for task 127b8e07-fff9-78bb-bf56-000000000237 ok: [managed-node2] 22690 1727204245.57533: no more pending results, returning what we have 22690 1727204245.57536: results queue empty 22690 1727204245.57538: checking for any_errors_fatal 22690 1727204245.57539: done checking for any_errors_fatal 22690 1727204245.57540: checking for max_fail_percentage 22690 1727204245.57542: done checking for max_fail_percentage 22690 1727204245.57543: checking to see if all hosts have failed and the running result is not ok 22690 1727204245.57544: done checking to see if all hosts have failed 22690 1727204245.57544: getting the remaining hosts for this loop 22690 1727204245.57546: done getting the remaining hosts for this loop 22690 1727204245.57550: getting the next task for host managed-node2 22690 1727204245.57555: done getting next task for host managed-node2 22690 1727204245.57557: ^ task is: TASK: meta (flush_handlers) 22690 1727204245.57559: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204245.57562: getting variables 22690 1727204245.57564: in VariableManager get_vars() 22690 1727204245.57597: Calling all_inventory to load vars for managed-node2 22690 1727204245.57601: Calling groups_inventory to load vars for managed-node2 22690 1727204245.57604: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.57787: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000237 22690 1727204245.57791: WORKER PROCESS EXITING 22690 1727204245.57803: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.57807: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.57811: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.58202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.58600: done with get_vars() 22690 1727204245.58612: done getting variables 22690 1727204245.58794: in VariableManager get_vars() 22690 1727204245.58809: Calling all_inventory to load vars for managed-node2 22690 1727204245.58811: Calling groups_inventory to load vars for managed-node2 22690 1727204245.58813: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.58818: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.58821: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.58823: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.59000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.59229: done with get_vars() 22690 1727204245.59244: done queuing things up, now waiting for results queue to drain 22690 1727204245.59246: results queue empty 22690 1727204245.59247: checking for any_errors_fatal 22690 1727204245.59251: done checking for any_errors_fatal 22690 1727204245.59252: checking for max_fail_percentage 22690 1727204245.59253: done checking for max_fail_percentage 22690 1727204245.59258: checking to see if all hosts have failed and the running result is not ok 22690 1727204245.59259: done checking to see if all hosts have failed 22690 1727204245.59259: getting the remaining hosts for this loop 22690 1727204245.59260: done getting the remaining hosts for this loop 22690 1727204245.59263: getting the next task for host managed-node2 22690 1727204245.59270: done getting next task for host managed-node2 22690 1727204245.59273: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22690 1727204245.59275: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204245.59285: getting variables 22690 1727204245.59287: in VariableManager get_vars() 22690 1727204245.59300: Calling all_inventory to load vars for managed-node2 22690 1727204245.59303: Calling groups_inventory to load vars for managed-node2 22690 1727204245.59305: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.59309: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.59312: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.59315: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.59468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.59714: done with get_vars() 22690 1727204245.59723: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.921) 0:00:12.881 ***** 22690 1727204245.59802: entering _queue_task() for managed-node2/include_tasks 22690 1727204245.60138: worker is 1 (out of 1 available) 22690 1727204245.60154: exiting _queue_task() for managed-node2/include_tasks 22690 1727204245.60172: done queuing things up, now waiting for results queue to drain 22690 1727204245.60174: waiting for pending results... 22690 1727204245.60403: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22690 1727204245.60521: in run() - task 127b8e07-fff9-78bb-bf56-000000000019 22690 1727204245.60549: variable 'ansible_search_path' from source: unknown 22690 1727204245.60557: variable 'ansible_search_path' from source: unknown 22690 1727204245.60603: calling self._execute() 22690 1727204245.60701: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.60713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.60732: variable 'omit' from source: magic vars 22690 1727204245.61136: variable 'ansible_distribution_major_version' from source: facts 22690 1727204245.61154: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204245.61166: _execute() done 22690 1727204245.61175: dumping result to json 22690 1727204245.61187: done dumping result, returning 22690 1727204245.61199: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-78bb-bf56-000000000019] 22690 1727204245.61209: sending task result for task 127b8e07-fff9-78bb-bf56-000000000019 22690 1727204245.61413: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000019 22690 1727204245.61419: WORKER PROCESS EXITING 22690 1727204245.61464: no more pending results, returning what we have 22690 1727204245.61471: in VariableManager get_vars() 22690 1727204245.61521: Calling all_inventory to load vars for managed-node2 22690 1727204245.61524: Calling groups_inventory to load vars for managed-node2 22690 1727204245.61526: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.61540: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.61543: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.61546: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.61737: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.61963: done with get_vars() 22690 1727204245.61974: variable 'ansible_search_path' from source: unknown 22690 1727204245.61975: variable 'ansible_search_path' from source: unknown 22690 1727204245.62007: we have included files to process 22690 1727204245.62008: generating all_blocks data 22690 1727204245.62010: done generating all_blocks data 22690 1727204245.62010: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204245.62011: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204245.62014: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204245.63034: done processing included file 22690 1727204245.63037: iterating over new_blocks loaded from include file 22690 1727204245.63038: in VariableManager get_vars() 22690 1727204245.63060: done with get_vars() 22690 1727204245.63061: filtering new block on tags 22690 1727204245.63080: done filtering new block on tags 22690 1727204245.63083: in VariableManager get_vars() 22690 1727204245.63102: done with get_vars() 22690 1727204245.63104: filtering new block on tags 22690 1727204245.63122: done filtering new block on tags 22690 1727204245.63125: in VariableManager get_vars() 22690 1727204245.63143: done with get_vars() 22690 1727204245.63144: filtering new block on tags 22690 1727204245.63161: done filtering new block on tags 22690 1727204245.63163: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 22690 1727204245.63172: extending task lists for all hosts with included blocks 22690 1727204245.63567: done extending task lists 22690 1727204245.63569: done processing included files 22690 1727204245.63570: results queue empty 22690 1727204245.63570: checking for any_errors_fatal 22690 1727204245.63572: done checking for any_errors_fatal 22690 1727204245.63572: checking for max_fail_percentage 22690 1727204245.63573: done checking for max_fail_percentage 22690 1727204245.63574: checking to see if all hosts have failed and the running result is not ok 22690 1727204245.63575: done checking to see if all hosts have failed 22690 1727204245.63576: getting the remaining hosts for this loop 22690 1727204245.63578: done getting the remaining hosts for this loop 22690 1727204245.63581: getting the next task for host managed-node2 22690 1727204245.63584: done getting next task for host managed-node2 22690 1727204245.63586: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22690 1727204245.63588: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204245.63597: getting variables 22690 1727204245.63597: in VariableManager get_vars() 22690 1727204245.63611: Calling all_inventory to load vars for managed-node2 22690 1727204245.63613: Calling groups_inventory to load vars for managed-node2 22690 1727204245.63615: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.63620: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.63622: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.63624: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.63773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.64084: done with get_vars() 22690 1727204245.64095: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.046) 0:00:12.928 ***** 22690 1727204245.64480: entering _queue_task() for managed-node2/setup 22690 1727204245.65061: worker is 1 (out of 1 available) 22690 1727204245.65280: exiting _queue_task() for managed-node2/setup 22690 1727204245.65295: done queuing things up, now waiting for results queue to drain 22690 1727204245.65296: waiting for pending results... 22690 1727204245.65793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22690 1727204245.66101: in run() - task 127b8e07-fff9-78bb-bf56-000000000279 22690 1727204245.66108: variable 'ansible_search_path' from source: unknown 22690 1727204245.66111: variable 'ansible_search_path' from source: unknown 22690 1727204245.66144: calling self._execute() 22690 1727204245.66536: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.66540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.66542: variable 'omit' from source: magic vars 22690 1727204245.67536: variable 'ansible_distribution_major_version' from source: facts 22690 1727204245.67539: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204245.67957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204245.72384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204245.72488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204245.72542: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204245.72584: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204245.72619: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204245.72719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204245.72763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22690 1727204245.72796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204245.72848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204245.72874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204245.72937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204245.72971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204245.73073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204245.73076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204245.73079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204245.73251: variable '__network_required_facts' from source: role '' defaults 22690 1727204245.73267: variable 'ansible_facts' from source: unknown 22690 1727204245.73376: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22690 1727204245.73384: when evaluation is False, skipping this task 22690 1727204245.73391: _execute() done 22690 1727204245.73403: dumping result to json 22690 1727204245.73410: done dumping result, returning 22690 1727204245.73425: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-78bb-bf56-000000000279] 22690 1727204245.73434: sending task result for task 127b8e07-fff9-78bb-bf56-000000000279 22690 1727204245.73660: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000279 22690 1727204245.73663: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204245.73712: no more pending results, returning what we have 22690 1727204245.73722: results queue empty 22690 1727204245.73724: checking for any_errors_fatal 22690 1727204245.73726: done checking for any_errors_fatal 22690 1727204245.73726: checking for max_fail_percentage 22690 1727204245.73728: done checking for max_fail_percentage 22690 1727204245.73729: checking to see if all hosts have failed and the running result is not ok 22690 1727204245.73730: done checking to see if all hosts have failed 22690 1727204245.73731: getting the remaining hosts for this loop 22690 1727204245.73732: done getting the remaining hosts for 
this loop 22690 1727204245.73737: getting the next task for host managed-node2 22690 1727204245.73746: done getting next task for host managed-node2 22690 1727204245.73750: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22690 1727204245.73753: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204245.73770: getting variables 22690 1727204245.73772: in VariableManager get_vars() 22690 1727204245.73815: Calling all_inventory to load vars for managed-node2 22690 1727204245.73820: Calling groups_inventory to load vars for managed-node2 22690 1727204245.73822: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.73950: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.73954: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.73959: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.74291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.74540: done with get_vars() 22690 1727204245.74554: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.101) 0:00:13.030 ***** 22690 1727204245.74673: entering _queue_task() for managed-node2/stat 22690 1727204245.75143: worker is 1 (out of 1 available) 22690 1727204245.75159: exiting _queue_task() for managed-node2/stat 22690 1727204245.75179: done queuing things up, now waiting for results queue to drain 22690 1727204245.75180: waiting for pending results... 
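
Editor's note: the conditional logged just above for the skipped "Ensure ansible_facts used by role are present" task, (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0), is a plain set-difference test: re-run setup only if some fact the role needs is missing. A minimal Python sketch of that behaviour follows; the required_facts values are illustrative assumptions, not the role's actual defaults.

# Minimal sketch (not the role's code) of the skipped conditional:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
required_facts = ["distribution", "distribution_major_version", "os_family"]  # assumed example values

def needs_fact_gathering(ansible_facts: dict, required: list[str]) -> bool:
    """True when at least one required fact key is absent, mirroring
    Jinja2's difference() | length > 0 test seen in the log."""
    missing = set(required) - set(ansible_facts.keys())
    return len(missing) > 0

# The preceding "Gathering Facts" task already populated every key,
# so the difference is empty, the conditional is False, and the task is skipped.
facts = {"distribution": "Fedora", "distribution_major_version": "40", "os_family": "RedHat"}
assert needs_fact_gathering(facts, required_facts) is False
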
22690 1727204245.75494: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 22690 1727204245.75576: in run() - task 127b8e07-fff9-78bb-bf56-00000000027b 22690 1727204245.75611: variable 'ansible_search_path' from source: unknown 22690 1727204245.75623: variable 'ansible_search_path' from source: unknown 22690 1727204245.75673: calling self._execute() 22690 1727204245.75790: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.75871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.75875: variable 'omit' from source: magic vars 22690 1727204245.76326: variable 'ansible_distribution_major_version' from source: facts 22690 1727204245.76350: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204245.76649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204245.76965: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204245.77032: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204245.77075: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204245.77128: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204245.77271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204245.77274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204245.77308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204245.77355: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204245.77473: variable '__network_is_ostree' from source: set_fact 22690 1727204245.77487: Evaluated conditional (not __network_is_ostree is defined): False 22690 1727204245.77546: when evaluation is False, skipping this task 22690 1727204245.77551: _execute() done 22690 1727204245.77554: dumping result to json 22690 1727204245.77556: done dumping result, returning 22690 1727204245.77559: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-78bb-bf56-00000000027b] 22690 1727204245.77562: sending task result for task 127b8e07-fff9-78bb-bf56-00000000027b 22690 1727204245.77744: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000027b 22690 1727204245.77748: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22690 1727204245.77809: no more pending results, returning what we have 22690 1727204245.77814: results queue empty 22690 1727204245.77815: checking for any_errors_fatal 22690 1727204245.77826: done checking for any_errors_fatal 22690 1727204245.77827: checking for 
max_fail_percentage 22690 1727204245.77829: done checking for max_fail_percentage 22690 1727204245.77830: checking to see if all hosts have failed and the running result is not ok 22690 1727204245.77831: done checking to see if all hosts have failed 22690 1727204245.77832: getting the remaining hosts for this loop 22690 1727204245.77833: done getting the remaining hosts for this loop 22690 1727204245.77838: getting the next task for host managed-node2 22690 1727204245.77845: done getting next task for host managed-node2 22690 1727204245.77850: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22690 1727204245.77853: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204245.77871: getting variables 22690 1727204245.77873: in VariableManager get_vars() 22690 1727204245.77920: Calling all_inventory to load vars for managed-node2 22690 1727204245.77923: Calling groups_inventory to load vars for managed-node2 22690 1727204245.77926: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.77939: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.77943: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.77947: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.78551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.79092: done with get_vars() 22690 1727204245.79105: done getting variables 22690 1727204245.79299: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.046) 0:00:13.076 ***** 22690 1727204245.79340: entering _queue_task() for managed-node2/set_fact 22690 1727204245.80100: worker is 1 (out of 1 available) 22690 1727204245.80115: exiting _queue_task() for managed-node2/set_fact 22690 1727204245.80133: done queuing things up, now waiting for results queue to drain 22690 1727204245.80135: waiting for pending results... 
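
Editor's note: the "Check if system is ostree" stat task skipped above and the "Set flag to indicate system is ostree" set_fact task queued here are both guarded by (not __network_is_ostree is defined), which evaluates to False because the fact was already set earlier in the run. A minimal Python sketch of that cache-then-skip pattern follows; the /run/ostree-booted probe path is an assumption for illustration only, since the log never shows the real probe (the task is skipped).

# Minimal sketch of the guard pattern behind the two skipped tasks.
import os

_cached_is_ostree = False  # stands in for the __network_is_ostree fact set earlier in the run

def is_ostree(cached):
    # "not __network_is_ostree is defined" -> probe only when no cached value exists
    if cached is not None:
        return cached                               # conditional is False: skip the probe
    return os.path.exists("/run/ostree-booted")     # assumed probe location, for illustration

print(is_ostree(_cached_is_ostree))  # False: the earlier fact is reused, no remote stat is needed
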
22690 1727204245.80561: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22690 1727204245.80569: in run() - task 127b8e07-fff9-78bb-bf56-00000000027c 22690 1727204245.80573: variable 'ansible_search_path' from source: unknown 22690 1727204245.80576: variable 'ansible_search_path' from source: unknown 22690 1727204245.80600: calling self._execute() 22690 1727204245.80708: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.80721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.80737: variable 'omit' from source: magic vars 22690 1727204245.81162: variable 'ansible_distribution_major_version' from source: facts 22690 1727204245.81185: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204245.81381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204245.81687: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204245.81746: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204245.81855: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204245.81859: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204245.81981: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204245.82016: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204245.82080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204245.82111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204245.82217: variable '__network_is_ostree' from source: set_fact 22690 1727204245.82232: Evaluated conditional (not __network_is_ostree is defined): False 22690 1727204245.82239: when evaluation is False, skipping this task 22690 1727204245.82471: _execute() done 22690 1727204245.82475: dumping result to json 22690 1727204245.82478: done dumping result, returning 22690 1727204245.82481: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-78bb-bf56-00000000027c] 22690 1727204245.82484: sending task result for task 127b8e07-fff9-78bb-bf56-00000000027c 22690 1727204245.82559: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000027c 22690 1727204245.82562: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22690 1727204245.82807: no more pending results, returning what we have 22690 1727204245.82810: results queue empty 22690 1727204245.82811: checking for any_errors_fatal 22690 1727204245.82819: done checking for any_errors_fatal 22690 
1727204245.82820: checking for max_fail_percentage 22690 1727204245.82822: done checking for max_fail_percentage 22690 1727204245.82822: checking to see if all hosts have failed and the running result is not ok 22690 1727204245.82823: done checking to see if all hosts have failed 22690 1727204245.82824: getting the remaining hosts for this loop 22690 1727204245.82825: done getting the remaining hosts for this loop 22690 1727204245.82828: getting the next task for host managed-node2 22690 1727204245.82835: done getting next task for host managed-node2 22690 1727204245.82839: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22690 1727204245.82841: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204245.82854: getting variables 22690 1727204245.82855: in VariableManager get_vars() 22690 1727204245.82892: Calling all_inventory to load vars for managed-node2 22690 1727204245.82895: Calling groups_inventory to load vars for managed-node2 22690 1727204245.82897: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204245.82906: Calling all_plugins_play to load vars for managed-node2 22690 1727204245.82909: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204245.82911: Calling groups_plugins_play to load vars for managed-node2 22690 1727204245.83210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204245.83905: done with get_vars() 22690 1727204245.83921: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:25 -0400 (0:00:00.046) 0:00:13.123 ***** 22690 1727204245.84027: entering _queue_task() for managed-node2/service_facts 22690 1727204245.84029: Creating lock for service_facts 22690 1727204245.84925: worker is 1 (out of 1 available) 22690 1727204245.84941: exiting _queue_task() for managed-node2/service_facts 22690 1727204245.84956: done queuing things up, now waiting for results queue to drain 22690 1727204245.84957: waiting for pending results... 
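
Editor's note: the service_facts module queued here returns the "services" mapping that appears further down in this log (one entry per unit with name, state, status, and source). A minimal Python sketch of how such a payload might be consumed follows, using two entries copied from the output below; the NetworkManager membership check is an illustrative assumption about how a role could use the data, not a quote of the role's logic.

# Minimal sketch of consuming the ansible_facts["services"] payload
# produced by the AnsiballZ_service_facts.py run that follows.
sample_services = {
    "auditd.service": {"name": "auditd.service", "state": "running",
                       "status": "enabled", "source": "systemd"},
    "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped",
                              "status": "static", "source": "systemd"},
}

def running_services(services: dict) -> list[str]:
    """Names of units the service_facts module reports as running."""
    return [name for name, info in services.items() if info.get("state") == "running"]

print(running_services(sample_services))                               # ['auditd.service']
print("NetworkManager.service" in running_services(sample_services))   # assumed check: False here
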
22690 1727204245.85687: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 22690 1727204245.85694: in run() - task 127b8e07-fff9-78bb-bf56-00000000027e 22690 1727204245.86073: variable 'ansible_search_path' from source: unknown 22690 1727204245.86078: variable 'ansible_search_path' from source: unknown 22690 1727204245.86081: calling self._execute() 22690 1727204245.86084: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.86086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.86089: variable 'omit' from source: magic vars 22690 1727204245.86844: variable 'ansible_distribution_major_version' from source: facts 22690 1727204245.87272: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204245.87277: variable 'omit' from source: magic vars 22690 1727204245.87279: variable 'omit' from source: magic vars 22690 1727204245.87282: variable 'omit' from source: magic vars 22690 1727204245.87285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204245.87672: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204245.87676: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204245.87678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204245.87681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204245.87683: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204245.87685: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.87687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.87690: Set connection var ansible_connection to ssh 22690 1727204245.87882: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204245.87897: Set connection var ansible_pipelining to False 22690 1727204245.87904: Set connection var ansible_shell_type to sh 22690 1727204245.87914: Set connection var ansible_shell_executable to /bin/sh 22690 1727204245.87926: Set connection var ansible_timeout to 10 22690 1727204245.87954: variable 'ansible_shell_executable' from source: unknown 22690 1727204245.87962: variable 'ansible_connection' from source: unknown 22690 1727204245.87971: variable 'ansible_module_compression' from source: unknown 22690 1727204245.87978: variable 'ansible_shell_type' from source: unknown 22690 1727204245.87984: variable 'ansible_shell_executable' from source: unknown 22690 1727204245.87991: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204245.88000: variable 'ansible_pipelining' from source: unknown 22690 1727204245.88007: variable 'ansible_timeout' from source: unknown 22690 1727204245.88014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204245.88428: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204245.88448: variable 'omit' from source: magic vars 22690 
1727204245.88672: starting attempt loop 22690 1727204245.88675: running the handler 22690 1727204245.88678: _low_level_execute_command(): starting 22690 1727204245.88680: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204245.89894: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204245.90189: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204245.90214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204245.90485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204245.90594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204245.92359: stdout chunk (state=3): >>>/root <<< 22690 1727204245.92538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204245.92555: stdout chunk (state=3): >>><<< 22690 1727204245.92572: stderr chunk (state=3): >>><<< 22690 1727204245.92600: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204245.92872: _low_level_execute_command(): starting 22690 1727204245.92876: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969 `" && echo ansible-tmp-1727204245.927752-23503-128601152586969="` echo 
/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969 `" ) && sleep 0' 22690 1727204245.94088: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204245.94114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204245.94129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204245.94182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204245.94205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204245.94333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204245.94544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204245.96797: stdout chunk (state=3): >>>ansible-tmp-1727204245.927752-23503-128601152586969=/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969 <<< 22690 1727204245.96901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204245.97037: stderr chunk (state=3): >>><<< 22690 1727204245.97048: stdout chunk (state=3): >>><<< 22690 1727204245.97076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204245.927752-23503-128601152586969=/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204245.97172: variable 'ansible_module_compression' from source: unknown 22690 1727204245.97304: ANSIBALLZ: Using lock for service_facts 22690 1727204245.97371: ANSIBALLZ: Acquiring lock 22690 
1727204245.97469: ANSIBALLZ: Lock acquired: 139846648795968 22690 1727204245.97473: ANSIBALLZ: Creating module 22690 1727204246.27798: ANSIBALLZ: Writing module into payload 22690 1727204246.28072: ANSIBALLZ: Writing module 22690 1727204246.28076: ANSIBALLZ: Renaming module 22690 1727204246.28078: ANSIBALLZ: Done creating module 22690 1727204246.28081: variable 'ansible_facts' from source: unknown 22690 1727204246.28083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py 22690 1727204246.28246: Sending initial data 22690 1727204246.28249: Sent initial data (161 bytes) 22690 1727204246.29088: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204246.29111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204246.29136: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204246.29144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204246.29247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204246.31092: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204246.31390: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpemuuqhgh" to remote "/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py" <<< 22690 1727204246.31395: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpemuuqhgh /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py <<< 22690 1727204246.32490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204246.32603: stderr chunk (state=3): >>><<< 22690 1727204246.32607: stdout chunk (state=3): >>><<< 22690 1727204246.32774: done transferring module to remote 22690 1727204246.32778: _low_level_execute_command(): starting 22690 1727204246.32781: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/ /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py && sleep 0' 22690 1727204246.33246: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204246.33255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204246.33270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204246.33285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204246.33296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204246.33303: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204246.33313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204246.33335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204246.33338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204246.33344: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204246.33352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204246.33361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204246.33445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204246.33454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204246.33457: stderr chunk (state=3): >>>debug2: match found <<< 22690 1727204246.33459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204246.33462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204246.33473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204246.33492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204246.33591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 22690 1727204246.35497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204246.35575: stderr chunk (state=3): >>><<< 22690 1727204246.35587: stdout chunk (state=3): >>><<< 22690 1727204246.35710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204246.35713: _low_level_execute_command(): starting 22690 1727204246.35722: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/AnsiballZ_service_facts.py && sleep 0' 22690 1727204246.36336: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204246.36350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204246.36364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204246.36386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204246.36407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204246.36424: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204246.36527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204246.36551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204246.36670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204248.53718: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.<<< 22690 1727204248.53736: stdout chunk (state=3): >>>service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "sourc<<< 22690 1727204248.53740: stdout chunk (state=3): >>>e": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": 
"systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-<<< 22690 1727204248.53786: stdout chunk (state=3): >>>utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": 
{"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22690 1727204248.55467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204248.55472: stdout chunk (state=3): >>><<< 22690 1727204248.55475: stderr chunk (state=3): >>><<< 22690 1727204248.55504: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": 
{"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": 
{"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": 
"active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204248.56558: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204248.56562: _low_level_execute_command(): starting 22690 1727204248.56639: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204245.927752-23503-128601152586969/ > /dev/null 2>&1 && sleep 0' 22690 1727204248.57306: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204248.57340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204248.57381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204248.57400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204248.57413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204248.57509: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204248.57513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204248.57574: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204248.57649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204248.59665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204248.59672: stdout chunk (state=3): >>><<< 22690 1727204248.59674: stderr chunk (state=3): >>><<< 22690 1727204248.59873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204248.59877: handler run complete 22690 1727204248.59969: variable 'ansible_facts' from source: unknown 22690 1727204248.60169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204248.60805: variable 'ansible_facts' from source: unknown 22690 1727204248.60979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204248.61550: attempt loop complete, returning result 22690 1727204248.61556: _execute() done 22690 1727204248.61559: dumping result to json 22690 1727204248.61694: done dumping result, returning 22690 1727204248.61706: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-78bb-bf56-00000000027e] 22690 1727204248.61709: sending task result for task 127b8e07-fff9-78bb-bf56-00000000027e ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204248.63669: no more pending results, returning what we have 22690 1727204248.63672: results queue empty 22690 1727204248.63673: checking for any_errors_fatal 22690 1727204248.63685: done checking for any_errors_fatal 22690 1727204248.63686: checking for max_fail_percentage 22690 1727204248.63688: done checking for max_fail_percentage 22690 1727204248.63689: checking to see if all hosts have failed and the running result is not ok 22690 1727204248.63690: done checking to see if all hosts have failed 22690 1727204248.63691: getting the remaining hosts for this loop 22690 1727204248.63692: done getting the remaining hosts for this loop 22690 1727204248.63695: getting the next task for host managed-node2 22690 1727204248.63701: done getting next task for host managed-node2 22690 1727204248.63704: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22690 1727204248.63706: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204248.63715: getting variables 22690 1727204248.63717: in VariableManager get_vars() 22690 1727204248.63745: Calling all_inventory to load vars for managed-node2 22690 1727204248.63747: Calling groups_inventory to load vars for managed-node2 22690 1727204248.63749: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204248.63758: Calling all_plugins_play to load vars for managed-node2 22690 1727204248.63760: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204248.63763: Calling groups_plugins_play to load vars for managed-node2 22690 1727204248.64162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204248.64688: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000027e 22690 1727204248.64692: WORKER PROCESS EXITING 22690 1727204248.64923: done with get_vars() 22690 1727204248.64944: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:28 -0400 (0:00:02.810) 0:00:15.934 ***** 22690 1727204248.65128: entering _queue_task() for managed-node2/package_facts 22690 1727204248.65130: Creating lock for package_facts 22690 1727204248.65702: worker is 1 (out of 1 available) 22690 1727204248.65716: exiting _queue_task() for managed-node2/package_facts 22690 1727204248.65728: done queuing things up, now waiting for results queue to drain 22690 1727204248.65730: waiting for pending results... 22690 1727204248.65892: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 22690 1727204248.66066: in run() - task 127b8e07-fff9-78bb-bf56-00000000027f 22690 1727204248.66074: variable 'ansible_search_path' from source: unknown 22690 1727204248.66077: variable 'ansible_search_path' from source: unknown 22690 1727204248.66099: calling self._execute() 22690 1727204248.66197: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204248.66210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204248.66224: variable 'omit' from source: magic vars 22690 1727204248.66620: variable 'ansible_distribution_major_version' from source: facts 22690 1727204248.66716: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204248.66720: variable 'omit' from source: magic vars 22690 1727204248.66722: variable 'omit' from source: magic vars 22690 1727204248.66754: variable 'omit' from source: magic vars 22690 1727204248.66806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204248.66853: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204248.66881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204248.66905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204248.66922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204248.66961: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204248.66971: variable 'ansible_host' from source: host vars for 
'managed-node2' 22690 1727204248.66979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204248.67097: Set connection var ansible_connection to ssh 22690 1727204248.67130: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204248.67167: Set connection var ansible_pipelining to False 22690 1727204248.67179: Set connection var ansible_shell_type to sh 22690 1727204248.67182: Set connection var ansible_shell_executable to /bin/sh 22690 1727204248.67188: Set connection var ansible_timeout to 10 22690 1727204248.67244: variable 'ansible_shell_executable' from source: unknown 22690 1727204248.67270: variable 'ansible_connection' from source: unknown 22690 1727204248.67273: variable 'ansible_module_compression' from source: unknown 22690 1727204248.67275: variable 'ansible_shell_type' from source: unknown 22690 1727204248.67282: variable 'ansible_shell_executable' from source: unknown 22690 1727204248.67284: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204248.67286: variable 'ansible_pipelining' from source: unknown 22690 1727204248.67288: variable 'ansible_timeout' from source: unknown 22690 1727204248.67472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204248.67516: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204248.67534: variable 'omit' from source: magic vars 22690 1727204248.67545: starting attempt loop 22690 1727204248.67551: running the handler 22690 1727204248.67571: _low_level_execute_command(): starting 22690 1727204248.67588: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204248.68373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204248.68391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204248.68406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204248.68429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204248.68448: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204248.68558: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204248.68595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204248.68696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204248.70395: stdout chunk (state=3): >>>/root <<< 22690 1727204248.70541: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204248.70687: stderr chunk (state=3): >>><<< 22690 1727204248.70691: stdout chunk (state=3): >>><<< 22690 1727204248.70712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204248.70885: _low_level_execute_command(): starting 22690 1727204248.70891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233 `" && echo ansible-tmp-1727204248.7077672-23714-128757136228233="` echo /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233 `" ) && sleep 0' 22690 1727204248.72197: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204248.72318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204248.72336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204248.72517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204248.74415: stdout chunk (state=3): >>>ansible-tmp-1727204248.7077672-23714-128757136228233=/root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233 <<< 22690 1727204248.74521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204248.74785: stderr chunk (state=3): >>><<< 22690 
1727204248.74789: stdout chunk (state=3): >>><<< 22690 1727204248.74801: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204248.7077672-23714-128757136228233=/root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204248.75073: variable 'ansible_module_compression' from source: unknown 22690 1727204248.75110: ANSIBALLZ: Using lock for package_facts 22690 1727204248.75118: ANSIBALLZ: Acquiring lock 22690 1727204248.75234: ANSIBALLZ: Lock acquired: 139846648923776 22690 1727204248.75238: ANSIBALLZ: Creating module 22690 1727204249.19154: ANSIBALLZ: Writing module into payload 22690 1727204249.19312: ANSIBALLZ: Writing module 22690 1727204249.19346: ANSIBALLZ: Renaming module 22690 1727204249.19355: ANSIBALLZ: Done creating module 22690 1727204249.19393: variable 'ansible_facts' from source: unknown 22690 1727204249.19585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py 22690 1727204249.19838: Sending initial data 22690 1727204249.19841: Sent initial data (162 bytes) 22690 1727204249.20282: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204249.20297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204249.20309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204249.20374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204249.20379: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 22690 1727204249.20382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204249.20456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204249.22219: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204249.22281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204249.22360: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpbslx3mk6 /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py <<< 22690 1727204249.22364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py" <<< 22690 1727204249.22436: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpbslx3mk6" to remote "/root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py" <<< 22690 1727204249.24099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204249.24138: stderr chunk (state=3): >>><<< 22690 1727204249.24197: stdout chunk (state=3): >>><<< 22690 1727204249.24201: done transferring module to remote 22690 1727204249.24208: _low_level_execute_command(): starting 22690 1727204249.24221: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/ /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py && sleep 0' 22690 1727204249.24797: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204249.24803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204249.24831: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204249.24834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204249.24838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204249.24897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204249.24908: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204249.24910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204249.24978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204249.26830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204249.27073: stderr chunk (state=3): >>><<< 22690 1727204249.27077: stdout chunk (state=3): >>><<< 22690 1727204249.27080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204249.27083: _low_level_execute_command(): starting 22690 1727204249.27085: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/AnsiballZ_package_facts.py && sleep 0' 22690 1727204249.27551: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204249.27560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204249.27575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204249.27590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204249.27603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204249.27610: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204249.27626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204249.27647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204249.27688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204249.27700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204249.27783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204249.90302: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": 
"53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 22690 1727204249.90397: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": 
[{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": 
"99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source"<<< 22690 1727204249.90983: stdout chunk (state=3): >>>: "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": 
"1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", 
"version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": 
"x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", 
"version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22690 1727204249.92474: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204249.92479: stdout chunk (state=3): >>><<< 22690 1727204249.92483: stderr chunk (state=3): >>><<< 22690 1727204249.92712: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": 
"1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": 
"2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": 
[{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", 
"version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": 
[{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": 
[{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": 
"cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": 
"1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", 
"version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": 
"6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": 
"0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204250.02874: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204250.02881: _low_level_execute_command(): starting 22690 1727204250.02884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204248.7077672-23714-128757136228233/ > /dev/null 2>&1 && sleep 0' 22690 1727204250.03496: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204250.03784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204250.03803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204250.03819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204250.03986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204250.06058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204250.06062: stdout chunk (state=3): >>><<< 22690 1727204250.06070: stderr chunk (state=3): >>><<< 22690 1727204250.06087: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204250.06095: handler run complete 22690 1727204250.07573: variable 'ansible_facts' from source: unknown 22690 1727204250.08078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.10950: variable 'ansible_facts' from source: unknown 22690 1727204250.11675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.12770: attempt loop complete, returning result 22690 1727204250.12803: _execute() done 22690 1727204250.12814: dumping result to json 22690 1727204250.13101: done dumping result, returning 22690 1727204250.13111: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-78bb-bf56-00000000027f] 22690 1727204250.13114: sending task result for task 127b8e07-fff9-78bb-bf56-00000000027f 22690 1727204250.16307: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000027f 22690 1727204250.16311: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204250.16409: no more pending results, returning what we have 22690 1727204250.16412: results queue empty 22690 1727204250.16413: checking for any_errors_fatal 22690 1727204250.16419: done checking for any_errors_fatal 22690 1727204250.16420: checking for max_fail_percentage 22690 1727204250.16421: done checking for max_fail_percentage 22690 1727204250.16422: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.16423: done checking to see if all hosts have failed 22690 1727204250.16424: getting the remaining hosts for this loop 22690 1727204250.16425: done getting the remaining hosts for this loop 22690 1727204250.16429: getting the next task for host managed-node2 22690 1727204250.16435: done getting next task for host managed-node2 22690 1727204250.16438: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22690 1727204250.16440: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204250.16449: getting variables 22690 1727204250.16450: in VariableManager get_vars() 22690 1727204250.16483: Calling all_inventory to load vars for managed-node2 22690 1727204250.16486: Calling groups_inventory to load vars for managed-node2 22690 1727204250.16489: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.16503: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.16506: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.16510: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.18055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.20487: done with get_vars() 22690 1727204250.20517: done getting variables 22690 1727204250.20588: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:30 -0400 (0:00:01.554) 0:00:17.489 ***** 22690 1727204250.20621: entering _queue_task() for managed-node2/debug 22690 1727204250.21200: worker is 1 (out of 1 available) 22690 1727204250.21210: exiting _queue_task() for managed-node2/debug 22690 1727204250.21222: done queuing things up, now waiting for results queue to drain 22690 1727204250.21224: waiting for pending results... 22690 1727204250.21355: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 22690 1727204250.21418: in run() - task 127b8e07-fff9-78bb-bf56-00000000001a 22690 1727204250.21445: variable 'ansible_search_path' from source: unknown 22690 1727204250.21456: variable 'ansible_search_path' from source: unknown 22690 1727204250.21504: calling self._execute() 22690 1727204250.21612: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.21625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.21639: variable 'omit' from source: magic vars 22690 1727204250.22108: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.22113: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.22116: variable 'omit' from source: magic vars 22690 1727204250.22129: variable 'omit' from source: magic vars 22690 1727204250.22243: variable 'network_provider' from source: set_fact 22690 1727204250.22268: variable 'omit' from source: magic vars 22690 1727204250.22321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204250.22373: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204250.22401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204250.22424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204250.22547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 
1727204250.22550: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204250.22553: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.22555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.22607: Set connection var ansible_connection to ssh 22690 1727204250.22623: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204250.22637: Set connection var ansible_pipelining to False 22690 1727204250.22644: Set connection var ansible_shell_type to sh 22690 1727204250.22657: Set connection var ansible_shell_executable to /bin/sh 22690 1727204250.22671: Set connection var ansible_timeout to 10 22690 1727204250.22701: variable 'ansible_shell_executable' from source: unknown 22690 1727204250.22710: variable 'ansible_connection' from source: unknown 22690 1727204250.22716: variable 'ansible_module_compression' from source: unknown 22690 1727204250.22723: variable 'ansible_shell_type' from source: unknown 22690 1727204250.22730: variable 'ansible_shell_executable' from source: unknown 22690 1727204250.22736: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.22744: variable 'ansible_pipelining' from source: unknown 22690 1727204250.22751: variable 'ansible_timeout' from source: unknown 22690 1727204250.22762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.22924: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204250.22941: variable 'omit' from source: magic vars 22690 1727204250.22951: starting attempt loop 22690 1727204250.22958: running the handler 22690 1727204250.23016: handler run complete 22690 1727204250.23070: attempt loop complete, returning result 22690 1727204250.23074: _execute() done 22690 1727204250.23076: dumping result to json 22690 1727204250.23079: done dumping result, returning 22690 1727204250.23081: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-78bb-bf56-00000000001a] 22690 1727204250.23086: sending task result for task 127b8e07-fff9-78bb-bf56-00000000001a ok: [managed-node2] => {} MSG: Using network provider: nm 22690 1727204250.23263: no more pending results, returning what we have 22690 1727204250.23270: results queue empty 22690 1727204250.23271: checking for any_errors_fatal 22690 1727204250.23282: done checking for any_errors_fatal 22690 1727204250.23283: checking for max_fail_percentage 22690 1727204250.23285: done checking for max_fail_percentage 22690 1727204250.23287: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.23288: done checking to see if all hosts have failed 22690 1727204250.23289: getting the remaining hosts for this loop 22690 1727204250.23290: done getting the remaining hosts for this loop 22690 1727204250.23295: getting the next task for host managed-node2 22690 1727204250.23302: done getting next task for host managed-node2 22690 1727204250.23306: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22690 1727204250.23309: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204250.23320: getting variables 22690 1727204250.23322: in VariableManager get_vars() 22690 1727204250.23364: Calling all_inventory to load vars for managed-node2 22690 1727204250.23575: Calling groups_inventory to load vars for managed-node2 22690 1727204250.23578: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.23591: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.23595: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.23599: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.24284: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000001a 22690 1727204250.24289: WORKER PROCESS EXITING 22690 1727204250.25396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.27536: done with get_vars() 22690 1727204250.27583: done getting variables 22690 1727204250.27654: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.070) 0:00:17.560 ***** 22690 1727204250.27692: entering _queue_task() for managed-node2/fail 22690 1727204250.28052: worker is 1 (out of 1 available) 22690 1727204250.28222: exiting _queue_task() for managed-node2/fail 22690 1727204250.28236: done queuing things up, now waiting for results queue to drain 22690 1727204250.28237: waiting for pending results... 
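The package_facts result recorded earlier in this output registers an ansible_facts.packages mapping: package name to a list of installed builds, each carrying name, epoch, version, release, arch and source. A minimal Python sketch of reading that shape (the file name is illustrative and not part of this run; openssl is one of the packages listed above):

    import json

    # Hypothetical file holding the package_facts stdout shown earlier in this log.
    with open("package_facts_result.json") as fh:
        result = json.load(fh)

    # name -> list of installed builds for that name
    packages = result["ansible_facts"]["packages"]

    for pkg in packages.get("openssl", []):
        # prints e.g.: openssl 1 3.2.2 3.fc40 x86_64 (epoch is None when the RPM has no epoch)
        print(pkg["name"], pkg["epoch"], pkg["version"], pkg["release"], pkg["arch"])
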
22690 1727204250.28387: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22690 1727204250.28516: in run() - task 127b8e07-fff9-78bb-bf56-00000000001b 22690 1727204250.28541: variable 'ansible_search_path' from source: unknown 22690 1727204250.28550: variable 'ansible_search_path' from source: unknown 22690 1727204250.28601: calling self._execute() 22690 1727204250.28704: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.28718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.28734: variable 'omit' from source: magic vars 22690 1727204250.29149: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.29173: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.29311: variable 'network_state' from source: role '' defaults 22690 1727204250.29329: Evaluated conditional (network_state != {}): False 22690 1727204250.29338: when evaluation is False, skipping this task 22690 1727204250.29347: _execute() done 22690 1727204250.29355: dumping result to json 22690 1727204250.29363: done dumping result, returning 22690 1727204250.29377: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-78bb-bf56-00000000001b] 22690 1727204250.29389: sending task result for task 127b8e07-fff9-78bb-bf56-00000000001b skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204250.29571: no more pending results, returning what we have 22690 1727204250.29576: results queue empty 22690 1727204250.29577: checking for any_errors_fatal 22690 1727204250.29586: done checking for any_errors_fatal 22690 1727204250.29587: checking for max_fail_percentage 22690 1727204250.29589: done checking for max_fail_percentage 22690 1727204250.29590: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.29591: done checking to see if all hosts have failed 22690 1727204250.29592: getting the remaining hosts for this loop 22690 1727204250.29593: done getting the remaining hosts for this loop 22690 1727204250.29599: getting the next task for host managed-node2 22690 1727204250.29606: done getting next task for host managed-node2 22690 1727204250.29611: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22690 1727204250.29615: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204250.29633: getting variables 22690 1727204250.29635: in VariableManager get_vars() 22690 1727204250.29786: Calling all_inventory to load vars for managed-node2 22690 1727204250.29789: Calling groups_inventory to load vars for managed-node2 22690 1727204250.29791: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.29978: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.29982: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.29986: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.30584: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000001b 22690 1727204250.30588: WORKER PROCESS EXITING 22690 1727204250.31845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.33992: done with get_vars() 22690 1727204250.34031: done getting variables 22690 1727204250.34103: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.064) 0:00:17.624 ***** 22690 1727204250.34137: entering _queue_task() for managed-node2/fail 22690 1727204250.34499: worker is 1 (out of 1 available) 22690 1727204250.34516: exiting _queue_task() for managed-node2/fail 22690 1727204250.34529: done queuing things up, now waiting for results queue to drain 22690 1727204250.34530: waiting for pending results... 
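The skip immediately above is driven by the logged false_condition "network_state != {}" evaluating to False against the role default network_state of {}. As a rough illustration only (not Ansible's internal code), the same expression can be checked directly with Jinja2, the templating engine Ansible uses for when: conditionals:

    from jinja2 import Environment

    env = Environment()
    task_vars = {"network_state": {}}                       # role default, as logged
    check = env.compile_expression("network_state != {}")   # the task's false_condition
    print(check(**task_vars))                               # False, so the task is skipped
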
22690 1727204250.34832: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22690 1727204250.34957: in run() - task 127b8e07-fff9-78bb-bf56-00000000001c 22690 1727204250.34984: variable 'ansible_search_path' from source: unknown 22690 1727204250.34998: variable 'ansible_search_path' from source: unknown 22690 1727204250.35044: calling self._execute() 22690 1727204250.35150: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.35164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.35183: variable 'omit' from source: magic vars 22690 1727204250.35605: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.35626: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.35776: variable 'network_state' from source: role '' defaults 22690 1727204250.35793: Evaluated conditional (network_state != {}): False 22690 1727204250.35802: when evaluation is False, skipping this task 22690 1727204250.35810: _execute() done 22690 1727204250.35818: dumping result to json 22690 1727204250.35827: done dumping result, returning 22690 1727204250.35839: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-78bb-bf56-00000000001c] 22690 1727204250.35850: sending task result for task 127b8e07-fff9-78bb-bf56-00000000001c 22690 1727204250.36071: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000001c 22690 1727204250.36075: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204250.36131: no more pending results, returning what we have 22690 1727204250.36137: results queue empty 22690 1727204250.36138: checking for any_errors_fatal 22690 1727204250.36148: done checking for any_errors_fatal 22690 1727204250.36149: checking for max_fail_percentage 22690 1727204250.36152: done checking for max_fail_percentage 22690 1727204250.36153: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.36154: done checking to see if all hosts have failed 22690 1727204250.36154: getting the remaining hosts for this loop 22690 1727204250.36156: done getting the remaining hosts for this loop 22690 1727204250.36160: getting the next task for host managed-node2 22690 1727204250.36170: done getting next task for host managed-node2 22690 1727204250.36174: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22690 1727204250.36177: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204250.36193: getting variables 22690 1727204250.36195: in VariableManager get_vars() 22690 1727204250.36240: Calling all_inventory to load vars for managed-node2 22690 1727204250.36243: Calling groups_inventory to load vars for managed-node2 22690 1727204250.36245: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.36261: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.36264: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.36474: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.38146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.40426: done with get_vars() 22690 1727204250.40462: done getting variables 22690 1727204250.40523: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.064) 0:00:17.688 ***** 22690 1727204250.40552: entering _queue_task() for managed-node2/fail 22690 1727204250.40910: worker is 1 (out of 1 available) 22690 1727204250.40927: exiting _queue_task() for managed-node2/fail 22690 1727204250.40942: done queuing things up, now waiting for results queue to drain 22690 1727204250.40944: waiting for pending results... 
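Each TASK banner in this output ends with the previous task's duration and the running total, e.g. "(0:00:00.064) 0:00:17.688" a few entries above. A small sketch, assuming this output has been saved to a file (the file name is illustrative) and keyed to how the banners appear in this run, that pulls those figures out to profile the play:

    import re

    # Matches: TASK [<name>] ... (<per-task duration>) <running total>
    banner = re.compile(
        r"TASK \[(?P<name>[^]]+)\].*?"
        r"\((?P<step>\d+:\d+:\d+\.\d+)\)\s+(?P<total>\d+:\d+:\d+\.\d+)",
        re.S,
    )

    with open("ansible_run.log") as fh:   # hypothetical saved copy of this output
        for m in banner.finditer(fh.read()):
            print(m["step"], m["total"], m["name"])
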
22690 1727204250.41264: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22690 1727204250.41391: in run() - task 127b8e07-fff9-78bb-bf56-00000000001d 22690 1727204250.41572: variable 'ansible_search_path' from source: unknown 22690 1727204250.41575: variable 'ansible_search_path' from source: unknown 22690 1727204250.41579: calling self._execute() 22690 1727204250.41582: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.41589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.41608: variable 'omit' from source: magic vars 22690 1727204250.42040: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.42062: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.42271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204250.44793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204250.44893: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204250.44938: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204250.44987: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204250.45021: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204250.45119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.45156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.45194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.45246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.45268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.45387: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.45571: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22690 1727204250.45574: variable 'ansible_distribution' from source: facts 22690 1727204250.45577: variable '__network_rh_distros' from source: role '' defaults 22690 1727204250.45579: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22690 1727204250.45581: when evaluation is False, skipping this task 22690 1727204250.45583: _execute() done 22690 1727204250.45586: dumping result to json 22690 1727204250.45588: done dumping result, returning 22690 1727204250.45602: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-78bb-bf56-00000000001d] 22690 1727204250.45611: sending task result for task 127b8e07-fff9-78bb-bf56-00000000001d skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22690 1727204250.45763: no more pending results, returning what we have 22690 1727204250.45769: results queue empty 22690 1727204250.45771: checking for any_errors_fatal 22690 1727204250.45776: done checking for any_errors_fatal 22690 1727204250.45777: checking for max_fail_percentage 22690 1727204250.45779: done checking for max_fail_percentage 22690 1727204250.45780: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.45781: done checking to see if all hosts have failed 22690 1727204250.45782: getting the remaining hosts for this loop 22690 1727204250.45783: done getting the remaining hosts for this loop 22690 1727204250.45788: getting the next task for host managed-node2 22690 1727204250.45794: done getting next task for host managed-node2 22690 1727204250.45798: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22690 1727204250.45801: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204250.45815: getting variables 22690 1727204250.45817: in VariableManager get_vars() 22690 1727204250.45860: Calling all_inventory to load vars for managed-node2 22690 1727204250.45863: Calling groups_inventory to load vars for managed-node2 22690 1727204250.45866: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.45879: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.45882: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.45885: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.46684: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000001d 22690 1727204250.46687: WORKER PROCESS EXITING 22690 1727204250.47872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.49935: done with get_vars() 22690 1727204250.49977: done getting variables 22690 1727204250.50095: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.095) 0:00:17.784 ***** 22690 1727204250.50129: entering _queue_task() for managed-node2/dnf 22690 1727204250.50498: worker is 1 (out of 1 available) 22690 1727204250.50513: exiting _queue_task() 
for managed-node2/dnf 22690 1727204250.50527: done queuing things up, now waiting for results queue to drain 22690 1727204250.50528: waiting for pending results... 22690 1727204250.50849: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22690 1727204250.51036: in run() - task 127b8e07-fff9-78bb-bf56-00000000001e 22690 1727204250.51062: variable 'ansible_search_path' from source: unknown 22690 1727204250.51074: variable 'ansible_search_path' from source: unknown 22690 1727204250.51123: calling self._execute() 22690 1727204250.51237: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.51258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.51359: variable 'omit' from source: magic vars 22690 1727204250.51726: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.51748: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.51906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204250.53874: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204250.54199: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204250.54324: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204250.54328: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204250.54335: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204250.54439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.54482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.54515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.54600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.54673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.54889: variable 'ansible_distribution' from source: facts 22690 1727204250.54892: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.54899: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22690 1727204250.55015: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204250.55118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.55139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.55164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.55196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.55212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.55248: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.55265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.55284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.55315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.55329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.55359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.55378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.55396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.55429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.55440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.55552: variable 'network_connections' from source: play vars 22690 1727204250.55562: variable 'interface' from source: set_fact 22690 1727204250.55620: variable 'interface' from source: set_fact 22690 1727204250.55630: variable 'interface' from source: set_fact 22690 
1727204250.55679: variable 'interface' from source: set_fact 22690 1727204250.55735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204250.55868: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204250.55897: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204250.55923: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204250.55947: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204250.55986: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204250.56003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204250.56027: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.56046: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204250.56100: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204250.60418: variable 'network_connections' from source: play vars 22690 1727204250.60424: variable 'interface' from source: set_fact 22690 1727204250.60481: variable 'interface' from source: set_fact 22690 1727204250.60486: variable 'interface' from source: set_fact 22690 1727204250.60532: variable 'interface' from source: set_fact 22690 1727204250.60564: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204250.60567: when evaluation is False, skipping this task 22690 1727204250.60570: _execute() done 22690 1727204250.60574: dumping result to json 22690 1727204250.60578: done dumping result, returning 22690 1727204250.60586: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-00000000001e] 22690 1727204250.60589: sending task result for task 127b8e07-fff9-78bb-bf56-00000000001e 22690 1727204250.60689: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000001e 22690 1727204250.60692: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204250.60739: no more pending results, returning what we have 22690 1727204250.60743: results queue empty 22690 1727204250.60743: checking for any_errors_fatal 22690 1727204250.60751: done checking for any_errors_fatal 22690 1727204250.60751: checking for max_fail_percentage 22690 1727204250.60753: done checking for max_fail_percentage 22690 1727204250.60754: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.60755: done checking to see if all hosts have 
failed 22690 1727204250.60755: getting the remaining hosts for this loop 22690 1727204250.60757: done getting the remaining hosts for this loop 22690 1727204250.60760: getting the next task for host managed-node2 22690 1727204250.60768: done getting next task for host managed-node2 22690 1727204250.60772: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22690 1727204250.60823: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204250.60840: getting variables 22690 1727204250.60841: in VariableManager get_vars() 22690 1727204250.60922: Calling all_inventory to load vars for managed-node2 22690 1727204250.60925: Calling groups_inventory to load vars for managed-node2 22690 1727204250.60927: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.60981: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.60984: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.60988: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.66239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.67406: done with get_vars() 22690 1727204250.67439: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22690 1727204250.67493: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.173) 0:00:17.958 ***** 22690 1727204250.67513: entering _queue_task() for managed-node2/yum 22690 1727204250.67514: Creating lock for yum 22690 1727204250.67798: worker is 1 (out of 1 available) 22690 1727204250.67812: exiting _queue_task() for managed-node2/yum 22690 1727204250.67829: done queuing things up, now waiting for results queue to drain 22690 1727204250.67831: waiting for pending results... 
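
The two "check for updates" tasks in this stretch are pure guard tasks: they only run when the play defines wireless or team connections, which is why the DNF-flavoured check above was skipped with false_condition "__network_wireless_connections_defined or __network_team_connections_defined". A minimal sketch of the shape of such a task follows; the when: expression is taken verbatim from the trace, the dnf module is inferred from the task name, and the name/state arguments plus the check_mode flag are illustrative assumptions (the real body lives in the role's tasks/main.yml).

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # network_packages resolves from the role defaults, as the trace shows
        state: latest
      check_mode: true                   # assumption: query only, never install here
      when: __network_wireless_connections_defined or __network_team_connections_defined

With neither guard variable true on this run, Ansible never builds a module invocation at all; the skip is decided entirely on the controller.
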
22690 1727204250.68028: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22690 1727204250.68109: in run() - task 127b8e07-fff9-78bb-bf56-00000000001f 22690 1727204250.68120: variable 'ansible_search_path' from source: unknown 22690 1727204250.68123: variable 'ansible_search_path' from source: unknown 22690 1727204250.68155: calling self._execute() 22690 1727204250.68236: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.68243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.68252: variable 'omit' from source: magic vars 22690 1727204250.68560: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.68573: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.68712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204250.70425: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204250.70485: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204250.70520: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204250.70547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204250.70570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204250.70644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.70668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.70691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.70723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.70734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.70815: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.70829: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22690 1727204250.70834: when evaluation is False, skipping this task 22690 1727204250.70837: _execute() done 22690 1727204250.70840: dumping result to json 22690 1727204250.70842: done dumping result, returning 22690 1727204250.70855: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-00000000001f] 22690 
1727204250.70859: sending task result for task 127b8e07-fff9-78bb-bf56-00000000001f 22690 1727204250.70964: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000001f 22690 1727204250.70971: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22690 1727204250.71038: no more pending results, returning what we have 22690 1727204250.71042: results queue empty 22690 1727204250.71043: checking for any_errors_fatal 22690 1727204250.71053: done checking for any_errors_fatal 22690 1727204250.71054: checking for max_fail_percentage 22690 1727204250.71055: done checking for max_fail_percentage 22690 1727204250.71056: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.71057: done checking to see if all hosts have failed 22690 1727204250.71058: getting the remaining hosts for this loop 22690 1727204250.71059: done getting the remaining hosts for this loop 22690 1727204250.71063: getting the next task for host managed-node2 22690 1727204250.71071: done getting next task for host managed-node2 22690 1727204250.71075: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22690 1727204250.71078: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204250.71093: getting variables 22690 1727204250.71095: in VariableManager get_vars() 22690 1727204250.71133: Calling all_inventory to load vars for managed-node2 22690 1727204250.71136: Calling groups_inventory to load vars for managed-node2 22690 1727204250.71138: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.71148: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.71151: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.71153: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.72177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.73498: done with get_vars() 22690 1727204250.73522: done getting variables 22690 1727204250.73577: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.060) 0:00:18.019 ***** 22690 1727204250.73602: entering _queue_task() for managed-node2/fail 22690 1727204250.73879: worker is 1 (out of 1 available) 22690 1727204250.73893: exiting _queue_task() for managed-node2/fail 22690 1727204250.73907: done queuing things up, now waiting for results queue to drain 22690 1727204250.73909: waiting for pending results... 
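
The task just queued at main.yml:60 uses the fail action plugin (loaded in the line above) as a consent gate: if wireless or team connections were defined and a NetworkManager restart would be needed, the role would abort with an explanatory message rather than restart silently. A minimal sketch under that assumption; the module and the when: expression come from the trace, the message wording is invented for illustration only.

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-
          Restarting NetworkManager is required when wireless or team
          connections are managed; re-run the play once that restart is
          acceptable.
      when: __network_wireless_connections_defined or __network_team_connections_defined

On this run the guard evaluates to False again, so the task is skipped a few lines below.
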
22690 1727204250.74104: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22690 1727204250.74186: in run() - task 127b8e07-fff9-78bb-bf56-000000000020 22690 1727204250.74198: variable 'ansible_search_path' from source: unknown 22690 1727204250.74201: variable 'ansible_search_path' from source: unknown 22690 1727204250.74239: calling self._execute() 22690 1727204250.74320: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.74324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.74333: variable 'omit' from source: magic vars 22690 1727204250.74644: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.74654: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.74745: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204250.74900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204250.76574: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204250.76610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204250.76641: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204250.76675: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204250.76697: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204250.76767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.76791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.76812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.76842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.76857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.76898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.76914: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.76934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.76962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.76979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.77007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.77024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.77043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.77072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.77086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.77219: variable 'network_connections' from source: play vars 22690 1727204250.77229: variable 'interface' from source: set_fact 22690 1727204250.77294: variable 'interface' from source: set_fact 22690 1727204250.77305: variable 'interface' from source: set_fact 22690 1727204250.77354: variable 'interface' from source: set_fact 22690 1727204250.77413: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204250.77559: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204250.77590: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204250.77619: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204250.77643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204250.77681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204250.77698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204250.77719: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.77739: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204250.77789: 
variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204250.77967: variable 'network_connections' from source: play vars 22690 1727204250.77974: variable 'interface' from source: set_fact 22690 1727204250.78024: variable 'interface' from source: set_fact 22690 1727204250.78030: variable 'interface' from source: set_fact 22690 1727204250.78079: variable 'interface' from source: set_fact 22690 1727204250.78105: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204250.78108: when evaluation is False, skipping this task 22690 1727204250.78111: _execute() done 22690 1727204250.78114: dumping result to json 22690 1727204250.78121: done dumping result, returning 22690 1727204250.78127: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000020] 22690 1727204250.78138: sending task result for task 127b8e07-fff9-78bb-bf56-000000000020 22690 1727204250.78233: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000020 22690 1727204250.78235: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204250.78315: no more pending results, returning what we have 22690 1727204250.78322: results queue empty 22690 1727204250.78323: checking for any_errors_fatal 22690 1727204250.78332: done checking for any_errors_fatal 22690 1727204250.78333: checking for max_fail_percentage 22690 1727204250.78335: done checking for max_fail_percentage 22690 1727204250.78336: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.78337: done checking to see if all hosts have failed 22690 1727204250.78337: getting the remaining hosts for this loop 22690 1727204250.78339: done getting the remaining hosts for this loop 22690 1727204250.78343: getting the next task for host managed-node2 22690 1727204250.78355: done getting next task for host managed-node2 22690 1727204250.78360: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22690 1727204250.78362: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204250.78378: getting variables 22690 1727204250.78379: in VariableManager get_vars() 22690 1727204250.78418: Calling all_inventory to load vars for managed-node2 22690 1727204250.78421: Calling groups_inventory to load vars for managed-node2 22690 1727204250.78423: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.78433: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.78435: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.78438: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.79452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.80643: done with get_vars() 22690 1727204250.80673: done getting variables 22690 1727204250.80725: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.071) 0:00:18.090 ***** 22690 1727204250.80752: entering _queue_task() for managed-node2/package 22690 1727204250.81032: worker is 1 (out of 1 available) 22690 1727204250.81046: exiting _queue_task() for managed-node2/package 22690 1727204250.81059: done queuing things up, now waiting for results queue to drain 22690 1727204250.81060: waiting for pending results... 22690 1727204250.81247: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 22690 1727204250.81327: in run() - task 127b8e07-fff9-78bb-bf56-000000000021 22690 1727204250.81356: variable 'ansible_search_path' from source: unknown 22690 1727204250.81360: variable 'ansible_search_path' from source: unknown 22690 1727204250.81401: calling self._execute() 22690 1727204250.81489: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.81495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.81504: variable 'omit' from source: magic vars 22690 1727204250.81842: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.81854: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.82006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204250.82223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204250.82262: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204250.82292: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204250.82355: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204250.82446: variable 'network_packages' from source: role '' defaults 22690 1727204250.82530: variable '__network_provider_setup' from source: role '' defaults 22690 1727204250.82540: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204250.82597: variable 
'__network_service_name_default_nm' from source: role '' defaults 22690 1727204250.82605: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204250.82653: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204250.82786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204250.84604: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204250.84658: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204250.84687: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204250.84713: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204250.84734: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204250.84803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.84825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.84843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.84877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.84889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.84924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.84941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.84958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.84991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.85003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.85160: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22690 1727204250.85260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.85279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.85297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.85329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.85340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.85407: variable 'ansible_python' from source: facts 22690 1727204250.85430: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22690 1727204250.85497: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204250.85563: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204250.85870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.85874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.85877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.85879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.85881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.85883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204250.85894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204250.85899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.85948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204250.85970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204250.86139: variable 'network_connections' from source: play vars 22690 1727204250.86152: variable 'interface' from source: set_fact 22690 1727204250.86270: variable 'interface' from source: set_fact 22690 1727204250.86286: variable 'interface' from source: set_fact 22690 1727204250.86393: variable 'interface' from source: set_fact 22690 1727204250.86479: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204250.86510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204250.86550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204250.86590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204250.86644: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204250.86948: variable 'network_connections' from source: play vars 22690 1727204250.86952: variable 'interface' from source: set_fact 22690 1727204250.87031: variable 'interface' from source: set_fact 22690 1727204250.87039: variable 'interface' from source: set_fact 22690 1727204250.87118: variable 'interface' from source: set_fact 22690 1727204250.87161: variable '__network_packages_default_wireless' from source: role '' defaults 22690 1727204250.87226: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204250.87440: variable 'network_connections' from source: play vars 22690 1727204250.87444: variable 'interface' from source: set_fact 22690 1727204250.87493: variable 'interface' from source: set_fact 22690 1727204250.87499: variable 'interface' from source: set_fact 22690 1727204250.87553: variable 'interface' from source: set_fact 22690 1727204250.87574: variable '__network_packages_default_team' from source: role '' defaults 22690 1727204250.87635: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204250.87847: variable 'network_connections' from source: play vars 22690 1727204250.87851: variable 'interface' from source: set_fact 22690 1727204250.87902: variable 'interface' from source: set_fact 22690 1727204250.87908: variable 'interface' from source: set_fact 22690 1727204250.87959: variable 'interface' from source: set_fact 22690 1727204250.88011: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204250.88058: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204250.88061: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204250.88110: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204250.88261: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22690 1727204250.88590: variable 'network_connections' from source: play vars 22690 1727204250.88594: variable 'interface' from source: set_fact 22690 
1727204250.88645: variable 'interface' from source: set_fact 22690 1727204250.88650: variable 'interface' from source: set_fact 22690 1727204250.88696: variable 'interface' from source: set_fact 22690 1727204250.88704: variable 'ansible_distribution' from source: facts 22690 1727204250.88707: variable '__network_rh_distros' from source: role '' defaults 22690 1727204250.88716: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.88737: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22690 1727204250.88856: variable 'ansible_distribution' from source: facts 22690 1727204250.88859: variable '__network_rh_distros' from source: role '' defaults 22690 1727204250.88864: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.88872: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22690 1727204250.89025: variable 'ansible_distribution' from source: facts 22690 1727204250.89271: variable '__network_rh_distros' from source: role '' defaults 22690 1727204250.89274: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.89277: variable 'network_provider' from source: set_fact 22690 1727204250.89279: variable 'ansible_facts' from source: unknown 22690 1727204250.89994: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22690 1727204250.90005: when evaluation is False, skipping this task 22690 1727204250.90012: _execute() done 22690 1727204250.90020: dumping result to json 22690 1727204250.90027: done dumping result, returning 22690 1727204250.90038: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-78bb-bf56-000000000021] 22690 1727204250.90046: sending task result for task 127b8e07-fff9-78bb-bf56-000000000021 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22690 1727204250.90229: no more pending results, returning what we have 22690 1727204250.90234: results queue empty 22690 1727204250.90234: checking for any_errors_fatal 22690 1727204250.90241: done checking for any_errors_fatal 22690 1727204250.90242: checking for max_fail_percentage 22690 1727204250.90244: done checking for max_fail_percentage 22690 1727204250.90245: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.90245: done checking to see if all hosts have failed 22690 1727204250.90246: getting the remaining hosts for this loop 22690 1727204250.90247: done getting the remaining hosts for this loop 22690 1727204250.90251: getting the next task for host managed-node2 22690 1727204250.90258: done getting next task for host managed-node2 22690 1727204250.90262: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22690 1727204250.90264: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204250.90326: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000021 22690 1727204250.90329: WORKER PROCESS EXITING 22690 1727204250.90338: getting variables 22690 1727204250.90340: in VariableManager get_vars() 22690 1727204250.90382: Calling all_inventory to load vars for managed-node2 22690 1727204250.90385: Calling groups_inventory to load vars for managed-node2 22690 1727204250.90387: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.90405: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.90408: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.90411: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.92429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204250.94623: done with get_vars() 22690 1727204250.94661: done getting variables 22690 1727204250.94725: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:30 -0400 (0:00:00.140) 0:00:18.230 ***** 22690 1727204250.94756: entering _queue_task() for managed-node2/package 22690 1727204250.95303: worker is 1 (out of 1 available) 22690 1727204250.95315: exiting _queue_task() for managed-node2/package 22690 1727204250.95330: done queuing things up, now waiting for results queue to drain 22690 1727204250.95332: waiting for pending results... 
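
The "Install packages" skip above is the role's idempotence shortcut: the package module is only invoked when at least one entry of network_packages is missing from the package facts gathered earlier, using the subset test from ansible.builtin (plugins/test/mathstuff.py, also loaded in this trace). A minimal sketch, assuming a straightforward package call; only the when: expression and the package action are taken from the trace, name/state are assumptions.

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      when: not network_packages is subset(ansible_facts.packages.keys())

Because every entry of network_packages is already present in ansible_facts.packages, the condition is False and no package manager transaction is attempted.
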
22690 1727204250.95491: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22690 1727204250.95629: in run() - task 127b8e07-fff9-78bb-bf56-000000000022 22690 1727204250.95653: variable 'ansible_search_path' from source: unknown 22690 1727204250.95664: variable 'ansible_search_path' from source: unknown 22690 1727204250.95744: calling self._execute() 22690 1727204250.96111: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204250.96118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204250.96121: variable 'omit' from source: magic vars 22690 1727204250.96878: variable 'ansible_distribution_major_version' from source: facts 22690 1727204250.97073: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204250.97215: variable 'network_state' from source: role '' defaults 22690 1727204250.97286: Evaluated conditional (network_state != {}): False 22690 1727204250.97294: when evaluation is False, skipping this task 22690 1727204250.97341: _execute() done 22690 1727204250.97350: dumping result to json 22690 1727204250.97357: done dumping result, returning 22690 1727204250.97371: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-78bb-bf56-000000000022] 22690 1727204250.97382: sending task result for task 127b8e07-fff9-78bb-bf56-000000000022 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204250.97662: no more pending results, returning what we have 22690 1727204250.97671: results queue empty 22690 1727204250.97672: checking for any_errors_fatal 22690 1727204250.97682: done checking for any_errors_fatal 22690 1727204250.97683: checking for max_fail_percentage 22690 1727204250.97685: done checking for max_fail_percentage 22690 1727204250.97687: checking to see if all hosts have failed and the running result is not ok 22690 1727204250.97688: done checking to see if all hosts have failed 22690 1727204250.97688: getting the remaining hosts for this loop 22690 1727204250.97690: done getting the remaining hosts for this loop 22690 1727204250.97695: getting the next task for host managed-node2 22690 1727204250.97702: done getting next task for host managed-node2 22690 1727204250.97707: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22690 1727204250.97710: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204250.97730: getting variables 22690 1727204250.97732: in VariableManager get_vars() 22690 1727204250.97883: Calling all_inventory to load vars for managed-node2 22690 1727204250.97886: Calling groups_inventory to load vars for managed-node2 22690 1727204250.97889: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204250.97904: Calling all_plugins_play to load vars for managed-node2 22690 1727204250.97908: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204250.97911: Calling groups_plugins_play to load vars for managed-node2 22690 1727204250.98686: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000022 22690 1727204250.98691: WORKER PROCESS EXITING 22690 1727204251.01735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204251.05709: done with get_vars() 22690 1727204251.05740: done getting variables 22690 1727204251.05818: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.110) 0:00:18.341 ***** 22690 1727204251.05843: entering _queue_task() for managed-node2/package 22690 1727204251.06138: worker is 1 (out of 1 available) 22690 1727204251.06154: exiting _queue_task() for managed-node2/package 22690 1727204251.06170: done queuing things up, now waiting for results queue to drain 22690 1727204251.06172: waiting for pending results... 
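
Both nmstate-related installs are keyed off the network_state variable: when the caller drives the role with network_state instead of network_connections, the role pulls in the nmstate stack first. A minimal sketch of the task that was just skipped; the when: guard and the package action are from the trace, while the package list is inferred from the task name and may not match the role exactly.

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # inferred from the task name, may differ from the role
          - nmstate
        state: present
      when: network_state != {}

Here network_state is still the role default (an empty dict), so the guard is False, and the python3-libnmstate task queued above is skipped on the same guard a few lines below.
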
22690 1727204251.06358: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22690 1727204251.06433: in run() - task 127b8e07-fff9-78bb-bf56-000000000023 22690 1727204251.06446: variable 'ansible_search_path' from source: unknown 22690 1727204251.06450: variable 'ansible_search_path' from source: unknown 22690 1727204251.06484: calling self._execute() 22690 1727204251.06564: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204251.06570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204251.06580: variable 'omit' from source: magic vars 22690 1727204251.06883: variable 'ansible_distribution_major_version' from source: facts 22690 1727204251.06895: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204251.06989: variable 'network_state' from source: role '' defaults 22690 1727204251.06997: Evaluated conditional (network_state != {}): False 22690 1727204251.07001: when evaluation is False, skipping this task 22690 1727204251.07004: _execute() done 22690 1727204251.07007: dumping result to json 22690 1727204251.07009: done dumping result, returning 22690 1727204251.07020: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-78bb-bf56-000000000023] 22690 1727204251.07023: sending task result for task 127b8e07-fff9-78bb-bf56-000000000023 22690 1727204251.07123: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000023 22690 1727204251.07127: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204251.07178: no more pending results, returning what we have 22690 1727204251.07183: results queue empty 22690 1727204251.07183: checking for any_errors_fatal 22690 1727204251.07192: done checking for any_errors_fatal 22690 1727204251.07193: checking for max_fail_percentage 22690 1727204251.07196: done checking for max_fail_percentage 22690 1727204251.07197: checking to see if all hosts have failed and the running result is not ok 22690 1727204251.07198: done checking to see if all hosts have failed 22690 1727204251.07199: getting the remaining hosts for this loop 22690 1727204251.07201: done getting the remaining hosts for this loop 22690 1727204251.07204: getting the next task for host managed-node2 22690 1727204251.07211: done getting next task for host managed-node2 22690 1727204251.07218: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22690 1727204251.07220: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204251.07238: getting variables 22690 1727204251.07239: in VariableManager get_vars() 22690 1727204251.07277: Calling all_inventory to load vars for managed-node2 22690 1727204251.07279: Calling groups_inventory to load vars for managed-node2 22690 1727204251.07281: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204251.07291: Calling all_plugins_play to load vars for managed-node2 22690 1727204251.07294: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204251.07296: Calling groups_plugins_play to load vars for managed-node2 22690 1727204251.08702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204251.10020: done with get_vars() 22690 1727204251.10051: done getting variables 22690 1727204251.10137: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.043) 0:00:18.385 ***** 22690 1727204251.10161: entering _queue_task() for managed-node2/service 22690 1727204251.10162: Creating lock for service 22690 1727204251.10440: worker is 1 (out of 1 available) 22690 1727204251.10454: exiting _queue_task() for managed-node2/service 22690 1727204251.10469: done queuing things up, now waiting for results queue to drain 22690 1727204251.10471: waiting for pending results... 
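
The restart task queued at main.yml:109 is the first place this run touches the service action plugin, which is why the trace shows found_in_cache=False and "Creating lock for service" above. A minimal sketch of its likely shape; the service action and the wireless/team guard appear in the trace, while the service name and the restarted state are assumptions.

    - name: Restart NetworkManager due to wireless or team interfaces
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolved from the role defaults later in the trace
        state: restarted
      when: __network_wireless_connections_defined or __network_team_connections_defined

As with the earlier wireless/team-guarded tasks, the condition evaluates to False below and the restart is skipped.
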
22690 1727204251.10650: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22690 1727204251.10723: in run() - task 127b8e07-fff9-78bb-bf56-000000000024 22690 1727204251.10736: variable 'ansible_search_path' from source: unknown 22690 1727204251.10740: variable 'ansible_search_path' from source: unknown 22690 1727204251.10773: calling self._execute() 22690 1727204251.10855: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204251.10860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204251.10870: variable 'omit' from source: magic vars 22690 1727204251.11191: variable 'ansible_distribution_major_version' from source: facts 22690 1727204251.11198: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204251.11470: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204251.11597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204251.13724: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204251.14060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204251.14091: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204251.14122: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204251.14144: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204251.14219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.14243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.14261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.14292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.14303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.14344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.14361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.14381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22690 1727204251.14409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.14421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.14457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.14475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.14493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.14521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.14531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.14692: variable 'network_connections' from source: play vars 22690 1727204251.14695: variable 'interface' from source: set_fact 22690 1727204251.14751: variable 'interface' from source: set_fact 22690 1727204251.14767: variable 'interface' from source: set_fact 22690 1727204251.14827: variable 'interface' from source: set_fact 22690 1727204251.14907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204251.15271: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204251.15275: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204251.15277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204251.15280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204251.15283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204251.15285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204251.15287: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.15312: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204251.15375: variable '__network_team_connections_defined' from source: role '' defaults 
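
Each time one of the __network_*_connections_defined guards is evaluated, the trace resolves network_connections and the interface fact first, which is the ordering you would expect if the guards are Jinja expressions over network_connections defined in the role defaults. A purely illustrative sketch of how such defaults could be written; this is an assumption about the pattern, not the role's actual defaults/main.yml.

    __network_wireless_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'wireless') | list | length > 0 }}"
    __network_team_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'equalto', 'team') | list | length > 0 }}"

With no wireless or team connections in network_connections on this run, both expressions come out False, which matches every skip in this section.
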
22690 1727204251.15632: variable 'network_connections' from source: play vars 22690 1727204251.15637: variable 'interface' from source: set_fact 22690 1727204251.15696: variable 'interface' from source: set_fact 22690 1727204251.15703: variable 'interface' from source: set_fact 22690 1727204251.15761: variable 'interface' from source: set_fact 22690 1727204251.15788: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204251.15792: when evaluation is False, skipping this task 22690 1727204251.15795: _execute() done 22690 1727204251.15797: dumping result to json 22690 1727204251.15802: done dumping result, returning 22690 1727204251.15809: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000024] 22690 1727204251.15824: sending task result for task 127b8e07-fff9-78bb-bf56-000000000024 22690 1727204251.15922: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000024 22690 1727204251.15925: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204251.16004: no more pending results, returning what we have 22690 1727204251.16008: results queue empty 22690 1727204251.16009: checking for any_errors_fatal 22690 1727204251.16020: done checking for any_errors_fatal 22690 1727204251.16021: checking for max_fail_percentage 22690 1727204251.16023: done checking for max_fail_percentage 22690 1727204251.16024: checking to see if all hosts have failed and the running result is not ok 22690 1727204251.16025: done checking to see if all hosts have failed 22690 1727204251.16025: getting the remaining hosts for this loop 22690 1727204251.16027: done getting the remaining hosts for this loop 22690 1727204251.16031: getting the next task for host managed-node2 22690 1727204251.16037: done getting next task for host managed-node2 22690 1727204251.16041: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22690 1727204251.16043: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204251.16057: getting variables 22690 1727204251.16059: in VariableManager get_vars() 22690 1727204251.16100: Calling all_inventory to load vars for managed-node2 22690 1727204251.16103: Calling groups_inventory to load vars for managed-node2 22690 1727204251.16105: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204251.16115: Calling all_plugins_play to load vars for managed-node2 22690 1727204251.16120: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204251.16123: Calling groups_plugins_play to load vars for managed-node2 22690 1727204251.17754: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204251.18921: done with get_vars() 22690 1727204251.18947: done getting variables 22690 1727204251.19002: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:31 -0400 (0:00:00.088) 0:00:18.473 ***** 22690 1727204251.19027: entering _queue_task() for managed-node2/service 22690 1727204251.19305: worker is 1 (out of 1 available) 22690 1727204251.19321: exiting _queue_task() for managed-node2/service 22690 1727204251.19334: done queuing things up, now waiting for results queue to drain 22690 1727204251.19336: waiting for pending results... 22690 1727204251.19696: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22690 1727204251.19773: in run() - task 127b8e07-fff9-78bb-bf56-000000000025 22690 1727204251.19777: variable 'ansible_search_path' from source: unknown 22690 1727204251.19781: variable 'ansible_search_path' from source: unknown 22690 1727204251.19805: calling self._execute() 22690 1727204251.19925: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204251.19938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204251.19953: variable 'omit' from source: magic vars 22690 1727204251.20375: variable 'ansible_distribution_major_version' from source: facts 22690 1727204251.20596: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204251.20640: variable 'network_provider' from source: set_fact 22690 1727204251.20643: variable 'network_state' from source: role '' defaults 22690 1727204251.20664: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22690 1727204251.20692: variable 'omit' from source: magic vars 22690 1727204251.20715: variable 'omit' from source: magic vars 22690 1727204251.20740: variable 'network_service_name' from source: role '' defaults 22690 1727204251.20805: variable 'network_service_name' from source: role '' defaults 22690 1727204251.20887: variable '__network_provider_setup' from source: role '' defaults 22690 1727204251.20892: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204251.20940: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204251.20948: variable '__network_packages_default_nm' from source: role '' defaults 
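The play has now reached the "Enable and start NetworkManager" task (roles/network/tasks/main.yml:122): the conditional network_provider == "nm" or network_state != {} evaluates to True and network_service_name is resolved from the role defaults. The module invocation recorded further down (module_args: name=NetworkManager, state=started, enabled=true) is consistent with a task along the lines of the hypothetical sketch below; the role's exact task text is not reproduced in this log, so treat the body as an assumption.

    # Hypothetical sketch -- reconstructed from the conditional and the
    # module_args visible later in this log, not copied from the role's tasks file.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"   # resolves to NetworkManager for the nm provider per the module_args below
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}
      no_log: true   # consistent with the censored result reported later in this log

Because the generic service action selects a backend module for the target host at runtime, the payload actually transferred in the entries below is ansible.legacy.systemd (AnsiballZ_systemd.py), whose stdout chunk carries the full NetworkManager.service unit properties.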
22690 1727204251.21000: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204251.21170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204251.22778: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204251.22842: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204251.22875: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204251.22902: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204251.22924: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204251.22994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.23020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.23038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.23073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.23084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.23122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.23138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.23157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.23189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.23201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.23363: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22690 1727204251.23456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.23475: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.23496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.23527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.23538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.23612: variable 'ansible_python' from source: facts 22690 1727204251.23633: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22690 1727204251.23700: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204251.23760: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204251.23855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.23875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.23893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.23923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.23936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.23976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204251.23997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204251.24018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.24046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204251.24060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204251.24164: variable 'network_connections' from 
source: play vars 22690 1727204251.24171: variable 'interface' from source: set_fact 22690 1727204251.24225: variable 'interface' from source: set_fact 22690 1727204251.24235: variable 'interface' from source: set_fact 22690 1727204251.24292: variable 'interface' from source: set_fact 22690 1727204251.24374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204251.24525: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204251.24562: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204251.24599: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204251.24630: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204251.24679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204251.24704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204251.24729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204251.24753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204251.24794: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204251.24989: variable 'network_connections' from source: play vars 22690 1727204251.24995: variable 'interface' from source: set_fact 22690 1727204251.25054: variable 'interface' from source: set_fact 22690 1727204251.25062: variable 'interface' from source: set_fact 22690 1727204251.25123: variable 'interface' from source: set_fact 22690 1727204251.25161: variable '__network_packages_default_wireless' from source: role '' defaults 22690 1727204251.25222: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204251.25422: variable 'network_connections' from source: play vars 22690 1727204251.25426: variable 'interface' from source: set_fact 22690 1727204251.25558: variable 'interface' from source: set_fact 22690 1727204251.25563: variable 'interface' from source: set_fact 22690 1727204251.25566: variable 'interface' from source: set_fact 22690 1727204251.25568: variable '__network_packages_default_team' from source: role '' defaults 22690 1727204251.25613: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204251.25810: variable 'network_connections' from source: play vars 22690 1727204251.25813: variable 'interface' from source: set_fact 22690 1727204251.25867: variable 'interface' from source: set_fact 22690 1727204251.25872: variable 'interface' from source: set_fact 22690 1727204251.25926: variable 'interface' from source: set_fact 22690 1727204251.25974: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204251.26024: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 
1727204251.26030: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204251.26076: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204251.26227: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22690 1727204251.26577: variable 'network_connections' from source: play vars 22690 1727204251.26580: variable 'interface' from source: set_fact 22690 1727204251.26625: variable 'interface' from source: set_fact 22690 1727204251.26630: variable 'interface' from source: set_fact 22690 1727204251.26679: variable 'interface' from source: set_fact 22690 1727204251.26686: variable 'ansible_distribution' from source: facts 22690 1727204251.26689: variable '__network_rh_distros' from source: role '' defaults 22690 1727204251.26695: variable 'ansible_distribution_major_version' from source: facts 22690 1727204251.26713: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22690 1727204251.26838: variable 'ansible_distribution' from source: facts 22690 1727204251.26841: variable '__network_rh_distros' from source: role '' defaults 22690 1727204251.26846: variable 'ansible_distribution_major_version' from source: facts 22690 1727204251.26854: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22690 1727204251.26974: variable 'ansible_distribution' from source: facts 22690 1727204251.26978: variable '__network_rh_distros' from source: role '' defaults 22690 1727204251.26981: variable 'ansible_distribution_major_version' from source: facts 22690 1727204251.27011: variable 'network_provider' from source: set_fact 22690 1727204251.27030: variable 'omit' from source: magic vars 22690 1727204251.27054: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204251.27080: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204251.27095: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204251.27111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204251.27123: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204251.27147: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204251.27150: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204251.27153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204251.27231: Set connection var ansible_connection to ssh 22690 1727204251.27239: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204251.27246: Set connection var ansible_pipelining to False 22690 1727204251.27249: Set connection var ansible_shell_type to sh 22690 1727204251.27255: Set connection var ansible_shell_executable to /bin/sh 22690 1727204251.27262: Set connection var ansible_timeout to 10 22690 1727204251.27283: variable 'ansible_shell_executable' from source: unknown 22690 1727204251.27286: variable 'ansible_connection' from source: unknown 22690 1727204251.27290: variable 'ansible_module_compression' from source: unknown 22690 1727204251.27292: variable 'ansible_shell_type' from source: unknown 22690 1727204251.27295: variable 
'ansible_shell_executable' from source: unknown 22690 1727204251.27298: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204251.27306: variable 'ansible_pipelining' from source: unknown 22690 1727204251.27308: variable 'ansible_timeout' from source: unknown 22690 1727204251.27310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204251.27390: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204251.27398: variable 'omit' from source: magic vars 22690 1727204251.27403: starting attempt loop 22690 1727204251.27406: running the handler 22690 1727204251.27470: variable 'ansible_facts' from source: unknown 22690 1727204251.28169: _low_level_execute_command(): starting 22690 1727204251.28175: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204251.28719: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204251.28723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204251.28726: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204251.28728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204251.28777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204251.28789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204251.28902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204251.30646: stdout chunk (state=3): >>>/root <<< 22690 1727204251.30753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204251.30826: stderr chunk (state=3): >>><<< 22690 1727204251.30830: stdout chunk (state=3): >>><<< 22690 1727204251.30848: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204251.30861: _low_level_execute_command(): starting 22690 1727204251.30868: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274 `" && echo ansible-tmp-1727204251.308491-23802-197508535167274="` echo /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274 `" ) && sleep 0' 22690 1727204251.31376: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204251.31380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204251.31382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204251.31385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204251.31435: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204251.31442: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204251.31445: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204251.31510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204251.33511: stdout chunk (state=3): >>>ansible-tmp-1727204251.308491-23802-197508535167274=/root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274 <<< 22690 1727204251.33622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204251.33681: stderr chunk (state=3): >>><<< 22690 1727204251.33685: stdout chunk (state=3): >>><<< 22690 1727204251.33700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204251.308491-23802-197508535167274=/root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204251.33732: variable 'ansible_module_compression' from source: unknown 22690 1727204251.33780: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 22690 1727204251.33784: ANSIBALLZ: Acquiring lock 22690 1727204251.33787: ANSIBALLZ: Lock acquired: 139846653776800 22690 1727204251.33789: ANSIBALLZ: Creating module 22690 1727204251.60853: ANSIBALLZ: Writing module into payload 22690 1727204251.61272: ANSIBALLZ: Writing module 22690 1727204251.61276: ANSIBALLZ: Renaming module 22690 1727204251.61278: ANSIBALLZ: Done creating module 22690 1727204251.61281: variable 'ansible_facts' from source: unknown 22690 1727204251.61360: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py 22690 1727204251.61640: Sending initial data 22690 1727204251.61652: Sent initial data (155 bytes) 22690 1727204251.62213: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204251.62328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204251.62357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204251.62468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204251.64201: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204251.64298: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204251.64370: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp5nyofsc0 /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py <<< 22690 1727204251.64391: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py" <<< 22690 1727204251.64507: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp5nyofsc0" to remote "/root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py" <<< 22690 1727204251.66578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204251.66672: stderr chunk (state=3): >>><<< 22690 1727204251.66770: stdout chunk (state=3): >>><<< 22690 1727204251.66775: done transferring module to remote 22690 1727204251.66778: _low_level_execute_command(): starting 22690 1727204251.66780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/ /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py && sleep 0' 22690 1727204251.67436: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204251.67479: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204251.67493: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204251.67505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204251.67539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204251.67624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204251.67657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204251.67762: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204251.69775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204251.69780: stderr chunk (state=3): >>><<< 22690 1727204251.69782: stdout 
chunk (state=3): >>><<< 22690 1727204251.69785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204251.69788: _low_level_execute_command(): starting 22690 1727204251.69790: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/AnsiballZ_systemd.py && sleep 0' 22690 1727204251.70494: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204251.70529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204251.70647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204251.70684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204251.70728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204251.70974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204252.02806: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": 
"0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3530973184", "CPUUsageNSec": "884091000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LimitCO<<< 22690 1727204252.02853: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": 
"NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22690 1727204252.04713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204252.04768: stderr chunk (state=3): >>><<< 22690 1727204252.04772: stdout chunk (state=3): >>><<< 22690 1727204252.04789: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3530973184", "CPUUsageNSec": "884091000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204252.05137: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204252.05140: _low_level_execute_command(): starting 22690 1727204252.05143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204251.308491-23802-197508535167274/ > /dev/null 2>&1 && sleep 0' 22690 1727204252.05772: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204252.05796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204252.05987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204252.06007: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204252.06160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204252.08073: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204252.08187: stderr chunk (state=3): >>><<< 22690 1727204252.08191: stdout chunk (state=3): >>><<< 22690 1727204252.08572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204252.08577: handler run complete 22690 1727204252.08580: attempt loop complete, returning result 22690 1727204252.08583: _execute() done 22690 1727204252.08585: dumping result to json 22690 1727204252.08587: done dumping result, returning 22690 1727204252.08589: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-78bb-bf56-000000000025] 22690 1727204252.08592: sending task result for task 127b8e07-fff9-78bb-bf56-000000000025 22690 1727204252.08972: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000025 22690 1727204252.08977: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204252.09035: no more pending results, returning what we have 22690 1727204252.09039: results queue empty 22690 1727204252.09040: checking for any_errors_fatal 22690 1727204252.09047: done checking for any_errors_fatal 22690 1727204252.09048: checking for max_fail_percentage 22690 1727204252.09050: done checking for max_fail_percentage 22690 1727204252.09051: checking to see if all hosts have failed and the running result is not ok 22690 1727204252.09052: done checking to see if all hosts have failed 22690 1727204252.09053: getting the remaining hosts for this loop 22690 1727204252.09055: done getting the remaining hosts for this loop 22690 1727204252.09059: getting the next task for host managed-node2 22690 1727204252.09070: done getting next task for host managed-node2 22690 1727204252.09074: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22690 1727204252.09077: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204252.09089: getting variables 22690 1727204252.09091: in VariableManager get_vars() 22690 1727204252.09132: Calling all_inventory to load vars for managed-node2 22690 1727204252.09135: Calling groups_inventory to load vars for managed-node2 22690 1727204252.09138: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204252.09151: Calling all_plugins_play to load vars for managed-node2 22690 1727204252.09154: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204252.09158: Calling groups_plugins_play to load vars for managed-node2 22690 1727204252.11387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204252.12830: done with get_vars() 22690 1727204252.12859: done getting variables 22690 1727204252.12913: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.939) 0:00:19.412 ***** 22690 1727204252.12937: entering _queue_task() for managed-node2/service 22690 1727204252.13213: worker is 1 (out of 1 available) 22690 1727204252.13228: exiting _queue_task() for managed-node2/service 22690 1727204252.13244: done queuing things up, now waiting for results queue to drain 22690 1727204252.13246: waiting for pending results... 22690 1727204252.13432: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22690 1727204252.13516: in run() - task 127b8e07-fff9-78bb-bf56-000000000026 22690 1727204252.13531: variable 'ansible_search_path' from source: unknown 22690 1727204252.13534: variable 'ansible_search_path' from source: unknown 22690 1727204252.13569: calling self._execute() 22690 1727204252.13654: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204252.13658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204252.13669: variable 'omit' from source: magic vars 22690 1727204252.13981: variable 'ansible_distribution_major_version' from source: facts 22690 1727204252.14026: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204252.14125: variable 'network_provider' from source: set_fact 22690 1727204252.14134: Evaluated conditional (network_provider == "nm"): True 22690 1727204252.14271: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204252.14373: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204252.14571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204252.16373: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204252.16427: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204252.16459: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204252.16487: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204252.16509: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204252.16594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204252.16616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204252.16638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204252.16672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204252.16683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204252.16720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204252.16739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204252.16757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204252.16791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204252.16803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204252.16837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204252.16854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204252.16875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204252.16905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204252.16916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 22690 1727204252.17026: variable 'network_connections' from source: play vars 22690 1727204252.17038: variable 'interface' from source: set_fact 22690 1727204252.17100: variable 'interface' from source: set_fact 22690 1727204252.17109: variable 'interface' from source: set_fact 22690 1727204252.17156: variable 'interface' from source: set_fact 22690 1727204252.17218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204252.17340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204252.17370: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204252.17393: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204252.17417: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204252.17453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204252.17470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204252.17489: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204252.17507: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204252.17550: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204252.17735: variable 'network_connections' from source: play vars 22690 1727204252.17740: variable 'interface' from source: set_fact 22690 1727204252.17789: variable 'interface' from source: set_fact 22690 1727204252.17795: variable 'interface' from source: set_fact 22690 1727204252.17843: variable 'interface' from source: set_fact 22690 1727204252.17880: Evaluated conditional (__network_wpa_supplicant_required): False 22690 1727204252.17883: when evaluation is False, skipping this task 22690 1727204252.17886: _execute() done 22690 1727204252.17897: dumping result to json 22690 1727204252.17900: done dumping result, returning 22690 1727204252.17903: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-78bb-bf56-000000000026] 22690 1727204252.17905: sending task result for task 127b8e07-fff9-78bb-bf56-000000000026 22690 1727204252.18000: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000026 22690 1727204252.18003: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22690 1727204252.18052: no more pending results, returning what we have 22690 1727204252.18056: results queue empty 22690 1727204252.18057: checking for any_errors_fatal 22690 1727204252.18089: done checking for any_errors_fatal 22690 1727204252.18090: checking for max_fail_percentage 22690 1727204252.18092: done checking for max_fail_percentage 22690 
1727204252.18093: checking to see if all hosts have failed and the running result is not ok 22690 1727204252.18094: done checking to see if all hosts have failed 22690 1727204252.18095: getting the remaining hosts for this loop 22690 1727204252.18096: done getting the remaining hosts for this loop 22690 1727204252.18100: getting the next task for host managed-node2 22690 1727204252.18106: done getting next task for host managed-node2 22690 1727204252.18111: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22690 1727204252.18114: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204252.18129: getting variables 22690 1727204252.18130: in VariableManager get_vars() 22690 1727204252.18179: Calling all_inventory to load vars for managed-node2 22690 1727204252.18182: Calling groups_inventory to load vars for managed-node2 22690 1727204252.18184: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204252.18194: Calling all_plugins_play to load vars for managed-node2 22690 1727204252.18197: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204252.18199: Calling groups_plugins_play to load vars for managed-node2 22690 1727204252.19217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204252.20400: done with get_vars() 22690 1727204252.20429: done getting variables 22690 1727204252.20485: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.075) 0:00:19.488 ***** 22690 1727204252.20510: entering _queue_task() for managed-node2/service 22690 1727204252.20783: worker is 1 (out of 1 available) 22690 1727204252.20797: exiting _queue_task() for managed-node2/service 22690 1727204252.20813: done queuing things up, now waiting for results queue to drain 22690 1727204252.20814: waiting for pending results... 
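
The wpa_supplicant skip above and the "Enable network service" evaluation that follows both come from conditional guards inside the role: the first task is gated on __network_wpa_supplicant_required (False in this run, since no 802.1x or wireless connections are defined), and the second only runs when the provider is "initscripts", while this run resolved network_provider to "nm". A minimal sketch of tasks with that shape follows; the task names and when-conditions mirror the log, but the service names and arguments are assumptions for illustration, not the role's actual source (the real tasks sit at the role's tasks/main.yml:133 and :142).

# Sketch only -- service names/arguments are assumed, when-conditions mirror the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant        # assumed unit name
    state: started
    enabled: true
  when:
    - network_provider == "nm"            # evaluated True in this run
    - __network_wpa_supplicant_required   # evaluated False in this run -> task skipped

- name: Enable network service
  ansible.builtin.service:
    name: network               # assumed unit name
    state: started
    enabled: true
  when: network_provider == "initscripts" # provider is "nm" here -> skipped
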
22690 1727204252.21011: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 22690 1727204252.21084: in run() - task 127b8e07-fff9-78bb-bf56-000000000027 22690 1727204252.21097: variable 'ansible_search_path' from source: unknown 22690 1727204252.21101: variable 'ansible_search_path' from source: unknown 22690 1727204252.21136: calling self._execute() 22690 1727204252.21214: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204252.21221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204252.21231: variable 'omit' from source: magic vars 22690 1727204252.21544: variable 'ansible_distribution_major_version' from source: facts 22690 1727204252.21554: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204252.21647: variable 'network_provider' from source: set_fact 22690 1727204252.21651: Evaluated conditional (network_provider == "initscripts"): False 22690 1727204252.21654: when evaluation is False, skipping this task 22690 1727204252.21658: _execute() done 22690 1727204252.21662: dumping result to json 22690 1727204252.21664: done dumping result, returning 22690 1727204252.21674: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-78bb-bf56-000000000027] 22690 1727204252.21679: sending task result for task 127b8e07-fff9-78bb-bf56-000000000027 22690 1727204252.21774: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000027 22690 1727204252.21777: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204252.21839: no more pending results, returning what we have 22690 1727204252.21844: results queue empty 22690 1727204252.21844: checking for any_errors_fatal 22690 1727204252.21853: done checking for any_errors_fatal 22690 1727204252.21853: checking for max_fail_percentage 22690 1727204252.21856: done checking for max_fail_percentage 22690 1727204252.21857: checking to see if all hosts have failed and the running result is not ok 22690 1727204252.21858: done checking to see if all hosts have failed 22690 1727204252.21858: getting the remaining hosts for this loop 22690 1727204252.21860: done getting the remaining hosts for this loop 22690 1727204252.21864: getting the next task for host managed-node2 22690 1727204252.21872: done getting next task for host managed-node2 22690 1727204252.21876: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22690 1727204252.21879: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204252.21894: getting variables 22690 1727204252.21895: in VariableManager get_vars() 22690 1727204252.21932: Calling all_inventory to load vars for managed-node2 22690 1727204252.21934: Calling groups_inventory to load vars for managed-node2 22690 1727204252.21936: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204252.21946: Calling all_plugins_play to load vars for managed-node2 22690 1727204252.21949: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204252.21951: Calling groups_plugins_play to load vars for managed-node2 22690 1727204252.23073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204252.24242: done with get_vars() 22690 1727204252.24275: done getting variables 22690 1727204252.24327: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.038) 0:00:19.526 ***** 22690 1727204252.24352: entering _queue_task() for managed-node2/copy 22690 1727204252.24633: worker is 1 (out of 1 available) 22690 1727204252.24649: exiting _queue_task() for managed-node2/copy 22690 1727204252.24662: done queuing things up, now waiting for results queue to drain 22690 1727204252.24664: waiting for pending results... 22690 1727204252.24854: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22690 1727204252.24934: in run() - task 127b8e07-fff9-78bb-bf56-000000000028 22690 1727204252.24948: variable 'ansible_search_path' from source: unknown 22690 1727204252.24952: variable 'ansible_search_path' from source: unknown 22690 1727204252.24986: calling self._execute() 22690 1727204252.25067: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204252.25071: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204252.25082: variable 'omit' from source: magic vars 22690 1727204252.25384: variable 'ansible_distribution_major_version' from source: facts 22690 1727204252.25396: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204252.25484: variable 'network_provider' from source: set_fact 22690 1727204252.25489: Evaluated conditional (network_provider == "initscripts"): False 22690 1727204252.25492: when evaluation is False, skipping this task 22690 1727204252.25496: _execute() done 22690 1727204252.25498: dumping result to json 22690 1727204252.25501: done dumping result, returning 22690 1727204252.25510: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-78bb-bf56-000000000028] 22690 1727204252.25514: sending task result for task 127b8e07-fff9-78bb-bf56-000000000028 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22690 1727204252.25673: no more pending results, returning what we have 22690 
1727204252.25679: results queue empty 22690 1727204252.25680: checking for any_errors_fatal 22690 1727204252.25687: done checking for any_errors_fatal 22690 1727204252.25688: checking for max_fail_percentage 22690 1727204252.25690: done checking for max_fail_percentage 22690 1727204252.25691: checking to see if all hosts have failed and the running result is not ok 22690 1727204252.25692: done checking to see if all hosts have failed 22690 1727204252.25693: getting the remaining hosts for this loop 22690 1727204252.25694: done getting the remaining hosts for this loop 22690 1727204252.25698: getting the next task for host managed-node2 22690 1727204252.25704: done getting next task for host managed-node2 22690 1727204252.25708: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22690 1727204252.25711: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204252.25728: getting variables 22690 1727204252.25729: in VariableManager get_vars() 22690 1727204252.25770: Calling all_inventory to load vars for managed-node2 22690 1727204252.25773: Calling groups_inventory to load vars for managed-node2 22690 1727204252.25775: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204252.25785: Calling all_plugins_play to load vars for managed-node2 22690 1727204252.25788: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204252.25790: Calling groups_plugins_play to load vars for managed-node2 22690 1727204252.26384: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000028 22690 1727204252.26388: WORKER PROCESS EXITING 22690 1727204252.26909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204252.28079: done with get_vars() 22690 1727204252.28107: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:32 -0400 (0:00:00.038) 0:00:19.565 ***** 22690 1727204252.28181: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22690 1727204252.28182: Creating lock for fedora.linux_system_roles.network_connections 22690 1727204252.28469: worker is 1 (out of 1 available) 22690 1727204252.28483: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22690 1727204252.28498: done queuing things up, now waiting for results queue to drain 22690 1727204252.28499: waiting for pending results... 
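
The "Configure networking connection profiles" task queued next is what invokes the fedora.linux_system_roles.network_connections module; the module_args echoed further down (provider "nm", a single ethernet profile "lsr27" with address 192.0.2.1/24 brought up) correspond to role input of roughly the following shape. This is a hypothetical, minimal play sketch: the connection values are taken from the logged invocation, but in the actual run network_provider and the interface name arrive via set_fact and play vars rather than being written inline like this.

# Hypothetical play sketch; values copied from the module_args logged below.
- hosts: managed-node2
  vars:
    network_provider: nm          # in this run the value comes from set_fact
    network_connections:
      - name: lsr27
        interface_name: lsr27     # the log derives this from an 'interface' fact
        type: ethernet
        autoconnect: true
        state: up
        ip:
          address: 192.0.2.1/24
  roles:
    - fedora.linux_system_roles.network

The stderr captured in the task result below ("add connection lsr27", "up connection lsr27") is the role's trace of applying exactly this profile.
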
22690 1727204252.28697: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22690 1727204252.28778: in run() - task 127b8e07-fff9-78bb-bf56-000000000029 22690 1727204252.28791: variable 'ansible_search_path' from source: unknown 22690 1727204252.28795: variable 'ansible_search_path' from source: unknown 22690 1727204252.28834: calling self._execute() 22690 1727204252.28914: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204252.28922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204252.28931: variable 'omit' from source: magic vars 22690 1727204252.29238: variable 'ansible_distribution_major_version' from source: facts 22690 1727204252.29250: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204252.29257: variable 'omit' from source: magic vars 22690 1727204252.29292: variable 'omit' from source: magic vars 22690 1727204252.29427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204252.31067: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204252.31118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204252.31153: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204252.31182: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204252.31204: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204252.31274: variable 'network_provider' from source: set_fact 22690 1727204252.31385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204252.31419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204252.31440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204252.31474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204252.31486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204252.31546: variable 'omit' from source: magic vars 22690 1727204252.31638: variable 'omit' from source: magic vars 22690 1727204252.31717: variable 'network_connections' from source: play vars 22690 1727204252.31730: variable 'interface' from source: set_fact 22690 1727204252.31784: variable 'interface' from source: set_fact 22690 1727204252.31792: variable 'interface' from source: set_fact 22690 1727204252.31840: variable 'interface' from source: set_fact 22690 1727204252.31959: variable 'omit' from source: magic vars 22690 1727204252.31968: 
variable '__lsr_ansible_managed' from source: task vars 22690 1727204252.32018: variable '__lsr_ansible_managed' from source: task vars 22690 1727204252.32234: Loaded config def from plugin (lookup/template) 22690 1727204252.32238: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22690 1727204252.32257: File lookup term: get_ansible_managed.j2 22690 1727204252.32261: variable 'ansible_search_path' from source: unknown 22690 1727204252.32268: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22690 1727204252.32280: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22690 1727204252.32294: variable 'ansible_search_path' from source: unknown 22690 1727204252.37210: variable 'ansible_managed' from source: unknown 22690 1727204252.37313: variable 'omit' from source: magic vars 22690 1727204252.37342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204252.37363: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204252.37383: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204252.37397: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204252.37406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204252.37431: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204252.37435: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204252.37437: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204252.37507: Set connection var ansible_connection to ssh 22690 1727204252.37515: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204252.37525: Set connection var ansible_pipelining to False 22690 1727204252.37528: Set connection var ansible_shell_type to sh 22690 1727204252.37533: Set connection var ansible_shell_executable to /bin/sh 22690 1727204252.37540: Set connection var ansible_timeout to 10 22690 1727204252.37561: variable 'ansible_shell_executable' from source: unknown 22690 1727204252.37564: variable 'ansible_connection' from source: unknown 22690 1727204252.37568: 
variable 'ansible_module_compression' from source: unknown 22690 1727204252.37571: variable 'ansible_shell_type' from source: unknown 22690 1727204252.37574: variable 'ansible_shell_executable' from source: unknown 22690 1727204252.37576: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204252.37579: variable 'ansible_pipelining' from source: unknown 22690 1727204252.37581: variable 'ansible_timeout' from source: unknown 22690 1727204252.37588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204252.37691: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204252.37705: variable 'omit' from source: magic vars 22690 1727204252.37708: starting attempt loop 22690 1727204252.37711: running the handler 22690 1727204252.37726: _low_level_execute_command(): starting 22690 1727204252.37733: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204252.38459: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204252.38470: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204252.38481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204252.38496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204252.38567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204252.38574: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204252.38580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.38583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204252.38585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204252.38587: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204252.38590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204252.38592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204252.38594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204252.38680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204252.38685: stderr chunk (state=3): >>>debug2: match found <<< 22690 1727204252.38688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.38690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204252.38692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204252.38790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204252.38969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204252.40749: stdout chunk (state=3): >>>/root <<< 22690 1727204252.41062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 
1727204252.41071: stdout chunk (state=3): >>><<< 22690 1727204252.41074: stderr chunk (state=3): >>><<< 22690 1727204252.41077: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204252.41079: _low_level_execute_command(): starting 22690 1727204252.41082: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953 `" && echo ansible-tmp-1727204252.409599-23837-142122974974953="` echo /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953 `" ) && sleep 0' 22690 1727204252.41769: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.41775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204252.41800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204252.41902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204252.44113: stdout chunk (state=3): >>>ansible-tmp-1727204252.409599-23837-142122974974953=/root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953 <<< 22690 1727204252.44117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204252.44120: stdout chunk (state=3): >>><<< 22690 1727204252.44122: stderr chunk (state=3): >>><<< 22690 1727204252.44272: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204252.409599-23837-142122974974953=/root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204252.44276: variable 'ansible_module_compression' from source: unknown 22690 1727204252.44279: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 22690 1727204252.44281: ANSIBALLZ: Acquiring lock 22690 1727204252.44283: ANSIBALLZ: Lock acquired: 139846649977632 22690 1727204252.44286: ANSIBALLZ: Creating module 22690 1727204252.66835: ANSIBALLZ: Writing module into payload 22690 1727204252.67070: ANSIBALLZ: Writing module 22690 1727204252.67094: ANSIBALLZ: Renaming module 22690 1727204252.67105: ANSIBALLZ: Done creating module 22690 1727204252.67126: variable 'ansible_facts' from source: unknown 22690 1727204252.67197: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py 22690 1727204252.67314: Sending initial data 22690 1727204252.67318: Sent initial data (167 bytes) 22690 1727204252.67839: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204252.67844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.67846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204252.67849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.67895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204252.67899: stderr chunk (state=3): >>>debug2: 
fd 3 setting O_NONBLOCK <<< 22690 1727204252.67901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204252.67979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204252.69633: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204252.69699: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204252.69773: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpxtrx2iom /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py <<< 22690 1727204252.69777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py" <<< 22690 1727204252.69837: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpxtrx2iom" to remote "/root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py" <<< 22690 1727204252.69842: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py" <<< 22690 1727204252.70692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204252.70768: stderr chunk (state=3): >>><<< 22690 1727204252.70773: stdout chunk (state=3): >>><<< 22690 1727204252.70795: done transferring module to remote 22690 1727204252.70807: _low_level_execute_command(): starting 22690 1727204252.70812: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/ /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py && sleep 0' 22690 1727204252.71300: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204252.71306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204252.71309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.71332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.71380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204252.71384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204252.71387: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204252.71464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204252.73298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204252.73356: stderr chunk (state=3): >>><<< 22690 1727204252.73362: stdout chunk (state=3): >>><<< 22690 1727204252.73381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204252.73385: _low_level_execute_command(): starting 22690 1727204252.73389: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/AnsiballZ_network_connections.py && sleep 0' 22690 1727204252.73859: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204252.73863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204252.73880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.73902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204252.73905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204252.73963: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204252.73970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204252.73972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204252.74052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204253.20754: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 22690 1727204253.20875: stdout chunk (state=3): >>> <<< 22690 1727204253.23000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204253.23042: stderr chunk (state=3): >>><<< 22690 1727204253.23052: stdout chunk (state=3): >>><<< 22690 1727204253.23083: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204253.23279: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204253.23282: _low_level_execute_command(): starting 22690 1727204253.23285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204252.409599-23837-142122974974953/ > /dev/null 2>&1 && sleep 0' 22690 1727204253.24294: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.24595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204253.24614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204253.24713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204253.26794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204253.26800: stdout chunk (state=3): >>><<< 22690 1727204253.26803: stderr chunk (state=3): >>><<< 22690 1727204253.26824: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204253.26885: handler run complete 22690 1727204253.27021: attempt loop complete, returning result 22690 1727204253.27377: _execute() done 22690 1727204253.27381: dumping result to json 22690 1727204253.27383: done dumping result, returning 22690 1727204253.27386: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-78bb-bf56-000000000029] 22690 1727204253.27389: sending task result for task 127b8e07-fff9-78bb-bf56-000000000029 changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 (not-active) 22690 1727204253.27710: no more pending results, returning what we have 22690 1727204253.27714: results queue empty 22690 1727204253.27718: checking for any_errors_fatal 22690 1727204253.27724: done checking for any_errors_fatal 22690 1727204253.27725: checking for max_fail_percentage 22690 1727204253.27727: done checking for max_fail_percentage 22690 1727204253.27727: checking to see if all hosts have failed and the running result is not ok 22690 1727204253.27728: done checking to see if all hosts have failed 22690 1727204253.27729: getting the remaining hosts for this loop 22690 1727204253.27730: done getting the remaining hosts for this loop 22690 1727204253.27735: getting the next task for host managed-node2 22690 1727204253.27741: done getting next task for host managed-node2 22690 1727204253.27746: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22690 1727204253.27748: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204253.27759: getting variables 22690 1727204253.27761: in VariableManager get_vars() 22690 1727204253.28214: Calling all_inventory to load vars for managed-node2 22690 1727204253.28220: Calling groups_inventory to load vars for managed-node2 22690 1727204253.28223: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204253.28237: Calling all_plugins_play to load vars for managed-node2 22690 1727204253.28240: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204253.28244: Calling groups_plugins_play to load vars for managed-node2 22690 1727204253.29377: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000029 22690 1727204253.29382: WORKER PROCESS EXITING 22690 1727204253.31196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204253.33563: done with get_vars() 22690 1727204253.33608: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:33 -0400 (0:00:01.055) 0:00:20.620 ***** 22690 1727204253.33713: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22690 1727204253.33718: Creating lock for fedora.linux_system_roles.network_state 22690 1727204253.34273: worker is 1 (out of 1 available) 22690 1727204253.34288: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22690 1727204253.34301: done queuing things up, now waiting for results queue to drain 22690 1727204253.34307: waiting for pending results... 22690 1727204253.34550: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 22690 1727204253.34674: in run() - task 127b8e07-fff9-78bb-bf56-00000000002a 22690 1727204253.34701: variable 'ansible_search_path' from source: unknown 22690 1727204253.34709: variable 'ansible_search_path' from source: unknown 22690 1727204253.34762: calling self._execute() 22690 1727204253.34880: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.34895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.34911: variable 'omit' from source: magic vars 22690 1727204253.35340: variable 'ansible_distribution_major_version' from source: facts 22690 1727204253.35358: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204253.35512: variable 'network_state' from source: role '' defaults 22690 1727204253.35532: Evaluated conditional (network_state != {}): False 22690 1727204253.35539: when evaluation is False, skipping this task 22690 1727204253.35546: _execute() done 22690 1727204253.35556: dumping result to json 22690 1727204253.35563: done dumping result, returning 22690 1727204253.35576: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-78bb-bf56-00000000002a] 22690 1727204253.35586: sending task result for task 127b8e07-fff9-78bb-bf56-00000000002a skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204253.35886: no more pending results, returning what we have 22690 1727204253.35891: results queue empty 22690 1727204253.35892: checking for any_errors_fatal 22690 1727204253.35902: done checking for 
any_errors_fatal 22690 1727204253.35903: checking for max_fail_percentage 22690 1727204253.35905: done checking for max_fail_percentage 22690 1727204253.35906: checking to see if all hosts have failed and the running result is not ok 22690 1727204253.35907: done checking to see if all hosts have failed 22690 1727204253.35908: getting the remaining hosts for this loop 22690 1727204253.35909: done getting the remaining hosts for this loop 22690 1727204253.35913: getting the next task for host managed-node2 22690 1727204253.35922: done getting next task for host managed-node2 22690 1727204253.35927: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22690 1727204253.35930: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204253.35944: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000002a 22690 1727204253.35947: WORKER PROCESS EXITING 22690 1727204253.36177: getting variables 22690 1727204253.36179: in VariableManager get_vars() 22690 1727204253.36220: Calling all_inventory to load vars for managed-node2 22690 1727204253.36223: Calling groups_inventory to load vars for managed-node2 22690 1727204253.36225: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204253.36236: Calling all_plugins_play to load vars for managed-node2 22690 1727204253.36238: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204253.36241: Calling groups_plugins_play to load vars for managed-node2 22690 1727204253.38346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204253.41297: done with get_vars() 22690 1727204253.41347: done getting variables 22690 1727204253.41434: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.077) 0:00:20.698 ***** 22690 1727204253.41470: entering _queue_task() for managed-node2/debug 22690 1727204253.41858: worker is 1 (out of 1 available) 22690 1727204253.41875: exiting _queue_task() for managed-node2/debug 22690 1727204253.41890: done queuing things up, now waiting for results queue to drain 22690 1727204253.41892: waiting for pending results... 
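The 'Configure networking connection profiles' result recorded above shows the exact arguments passed to fedora.linux_system_roles.network_connections: provider 'nm' and a single ethernet profile 'lsr27' with the static address 192.0.2.1/24. As a reconstructed sketch based only on those logged arguments, and not on the actual test playbook used here, role input of roughly the following shape would produce that invocation:

    - hosts: managed-node2
      vars:
        network_provider: nm
        network_connections:
          - name: lsr27
            interface_name: lsr27
            type: ethernet
            state: up
            autoconnect: true
            ip:
              address: 192.0.2.1/24
      roles:
        - fedora.linux_system_roles.network

The '[003] ... add connection' and '[004] ... up connection' lines in the STDERR block are the module's own progress messages; the debug tasks that run next in this log print them back out.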
22690 1727204253.42171: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22690 1727204253.42285: in run() - task 127b8e07-fff9-78bb-bf56-00000000002b 22690 1727204253.42311: variable 'ansible_search_path' from source: unknown 22690 1727204253.42319: variable 'ansible_search_path' from source: unknown 22690 1727204253.42367: calling self._execute() 22690 1727204253.42471: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.42485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.42498: variable 'omit' from source: magic vars 22690 1727204253.42881: variable 'ansible_distribution_major_version' from source: facts 22690 1727204253.43071: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204253.43075: variable 'omit' from source: magic vars 22690 1727204253.43077: variable 'omit' from source: magic vars 22690 1727204253.43080: variable 'omit' from source: magic vars 22690 1727204253.43082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204253.43138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204253.43177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204253.43202: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204253.43220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204253.43374: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204253.43382: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.43389: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.43564: Set connection var ansible_connection to ssh 22690 1727204253.43570: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204253.43573: Set connection var ansible_pipelining to False 22690 1727204253.43576: Set connection var ansible_shell_type to sh 22690 1727204253.43578: Set connection var ansible_shell_executable to /bin/sh 22690 1727204253.43580: Set connection var ansible_timeout to 10 22690 1727204253.43581: variable 'ansible_shell_executable' from source: unknown 22690 1727204253.43583: variable 'ansible_connection' from source: unknown 22690 1727204253.43609: variable 'ansible_module_compression' from source: unknown 22690 1727204253.43616: variable 'ansible_shell_type' from source: unknown 22690 1727204253.43622: variable 'ansible_shell_executable' from source: unknown 22690 1727204253.43628: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.43633: variable 'ansible_pipelining' from source: unknown 22690 1727204253.43638: variable 'ansible_timeout' from source: unknown 22690 1727204253.43644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.43892: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 
1727204253.43895: variable 'omit' from source: magic vars 22690 1727204253.43898: starting attempt loop 22690 1727204253.43900: running the handler 22690 1727204253.44042: variable '__network_connections_result' from source: set_fact 22690 1727204253.44112: handler run complete 22690 1727204253.44138: attempt loop complete, returning result 22690 1727204253.44147: _execute() done 22690 1727204253.44154: dumping result to json 22690 1727204253.44162: done dumping result, returning 22690 1727204253.44178: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-78bb-bf56-00000000002b] 22690 1727204253.44186: sending task result for task 127b8e07-fff9-78bb-bf56-00000000002b ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 (not-active)" ] } 22690 1727204253.44395: no more pending results, returning what we have 22690 1727204253.44401: results queue empty 22690 1727204253.44402: checking for any_errors_fatal 22690 1727204253.44410: done checking for any_errors_fatal 22690 1727204253.44411: checking for max_fail_percentage 22690 1727204253.44413: done checking for max_fail_percentage 22690 1727204253.44414: checking to see if all hosts have failed and the running result is not ok 22690 1727204253.44415: done checking to see if all hosts have failed 22690 1727204253.44416: getting the remaining hosts for this loop 22690 1727204253.44417: done getting the remaining hosts for this loop 22690 1727204253.44422: getting the next task for host managed-node2 22690 1727204253.44429: done getting next task for host managed-node2 22690 1727204253.44433: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22690 1727204253.44436: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204253.44447: getting variables 22690 1727204253.44449: in VariableManager get_vars() 22690 1727204253.44713: Calling all_inventory to load vars for managed-node2 22690 1727204253.44717: Calling groups_inventory to load vars for managed-node2 22690 1727204253.44720: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204253.44735: Calling all_plugins_play to load vars for managed-node2 22690 1727204253.44738: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204253.44742: Calling groups_plugins_play to load vars for managed-node2 22690 1727204253.45371: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000002b 22690 1727204253.45377: WORKER PROCESS EXITING 22690 1727204253.46978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204253.49354: done with get_vars() 22690 1727204253.49405: done getting variables 22690 1727204253.49491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.080) 0:00:20.778 ***** 22690 1727204253.49529: entering _queue_task() for managed-node2/debug 22690 1727204253.49951: worker is 1 (out of 1 available) 22690 1727204253.49969: exiting _queue_task() for managed-node2/debug 22690 1727204253.49986: done queuing things up, now waiting for results queue to drain 22690 1727204253.49987: waiting for pending results... 
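The 'Show stderr messages for the network_connections' result above prints only the stderr_lines field of the registered __network_connections_result fact, which is why its 'ok' block contains exactly the two '[003]'/'[004]' module messages. A minimal equivalent of that task, inferred from the output and task name rather than copied from the role source at tasks/main.yml:177, would be:

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines
      when: ansible_distribution_major_version != '6'

The companion 'Show debug messages for the network_connections' task being queued here dumps the entire registered result, including the original module invocation, as its 'ok' block below shows.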
22690 1727204253.50305: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22690 1727204253.50428: in run() - task 127b8e07-fff9-78bb-bf56-00000000002c 22690 1727204253.50445: variable 'ansible_search_path' from source: unknown 22690 1727204253.50449: variable 'ansible_search_path' from source: unknown 22690 1727204253.50501: calling self._execute() 22690 1727204253.50629: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.50636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.50691: variable 'omit' from source: magic vars 22690 1727204253.51576: variable 'ansible_distribution_major_version' from source: facts 22690 1727204253.51587: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204253.51596: variable 'omit' from source: magic vars 22690 1727204253.51691: variable 'omit' from source: magic vars 22690 1727204253.51738: variable 'omit' from source: magic vars 22690 1727204253.51793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204253.51851: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204253.51880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204253.51905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204253.51930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204253.51978: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204253.51988: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.51996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.52122: Set connection var ansible_connection to ssh 22690 1727204253.52141: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204253.52154: Set connection var ansible_pipelining to False 22690 1727204253.52167: Set connection var ansible_shell_type to sh 22690 1727204253.52183: Set connection var ansible_shell_executable to /bin/sh 22690 1727204253.52196: Set connection var ansible_timeout to 10 22690 1727204253.52229: variable 'ansible_shell_executable' from source: unknown 22690 1727204253.52237: variable 'ansible_connection' from source: unknown 22690 1727204253.52272: variable 'ansible_module_compression' from source: unknown 22690 1727204253.52276: variable 'ansible_shell_type' from source: unknown 22690 1727204253.52283: variable 'ansible_shell_executable' from source: unknown 22690 1727204253.52286: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.52288: variable 'ansible_pipelining' from source: unknown 22690 1727204253.52290: variable 'ansible_timeout' from source: unknown 22690 1727204253.52292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.52496: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 
1727204253.52509: variable 'omit' from source: magic vars 22690 1727204253.52604: starting attempt loop 22690 1727204253.52609: running the handler 22690 1727204253.52612: variable '__network_connections_result' from source: set_fact 22690 1727204253.52713: variable '__network_connections_result' from source: set_fact 22690 1727204253.52931: handler run complete 22690 1727204253.52955: attempt loop complete, returning result 22690 1727204253.52962: _execute() done 22690 1727204253.52972: dumping result to json 22690 1727204253.53040: done dumping result, returning 22690 1727204253.53047: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-78bb-bf56-00000000002c] 22690 1727204253.53054: sending task result for task 127b8e07-fff9-78bb-bf56-00000000002c ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 5a4ab182-4b5b-42b9-9199-87bcb8efcb93 (not-active)" ] } } 22690 1727204253.53393: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000002c 22690 1727204253.53397: WORKER PROCESS EXITING 22690 1727204253.53410: no more pending results, returning what we have 22690 1727204253.53418: results queue empty 22690 1727204253.53419: checking for any_errors_fatal 22690 1727204253.53425: done checking for any_errors_fatal 22690 1727204253.53426: checking for max_fail_percentage 22690 1727204253.53428: done checking for max_fail_percentage 22690 1727204253.53429: checking to see if all hosts have failed and the running result is not ok 22690 1727204253.53431: done checking to see if all hosts have failed 22690 1727204253.53431: getting the remaining hosts for this loop 22690 1727204253.53433: done getting the remaining hosts for this loop 22690 1727204253.53437: getting the next task for host managed-node2 22690 1727204253.53444: done getting next task for host managed-node2 22690 1727204253.53448: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22690 1727204253.53451: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204253.53462: getting variables 22690 1727204253.53464: in VariableManager get_vars() 22690 1727204253.53580: Calling all_inventory to load vars for managed-node2 22690 1727204253.53583: Calling groups_inventory to load vars for managed-node2 22690 1727204253.53585: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204253.53599: Calling all_plugins_play to load vars for managed-node2 22690 1727204253.53602: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204253.53605: Calling groups_plugins_play to load vars for managed-node2 22690 1727204253.54635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204253.56530: done with get_vars() 22690 1727204253.56572: done getting variables 22690 1727204253.56648: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.071) 0:00:20.850 ***** 22690 1727204253.56684: entering _queue_task() for managed-node2/debug 22690 1727204253.57120: worker is 1 (out of 1 available) 22690 1727204253.57141: exiting _queue_task() for managed-node2/debug 22690 1727204253.57157: done queuing things up, now waiting for results queue to drain 22690 1727204253.57158: waiting for pending results... 22690 1727204253.57423: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22690 1727204253.57545: in run() - task 127b8e07-fff9-78bb-bf56-00000000002d 22690 1727204253.57551: variable 'ansible_search_path' from source: unknown 22690 1727204253.57554: variable 'ansible_search_path' from source: unknown 22690 1727204253.57583: calling self._execute() 22690 1727204253.57691: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.57696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.57703: variable 'omit' from source: magic vars 22690 1727204253.58018: variable 'ansible_distribution_major_version' from source: facts 22690 1727204253.58029: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204253.58119: variable 'network_state' from source: role '' defaults 22690 1727204253.58126: Evaluated conditional (network_state != {}): False 22690 1727204253.58134: when evaluation is False, skipping this task 22690 1727204253.58141: _execute() done 22690 1727204253.58144: dumping result to json 22690 1727204253.58147: done dumping result, returning 22690 1727204253.58150: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-78bb-bf56-00000000002d] 22690 1727204253.58152: sending task result for task 127b8e07-fff9-78bb-bf56-00000000002d skipping: [managed-node2] => { "false_condition": "network_state != {}" } 22690 1727204253.58306: no more pending results, returning what we have 22690 1727204253.58311: results queue empty 22690 1727204253.58311: checking for any_errors_fatal 22690 1727204253.58324: done checking 
for any_errors_fatal 22690 1727204253.58325: checking for max_fail_percentage 22690 1727204253.58327: done checking for max_fail_percentage 22690 1727204253.58328: checking to see if all hosts have failed and the running result is not ok 22690 1727204253.58329: done checking to see if all hosts have failed 22690 1727204253.58330: getting the remaining hosts for this loop 22690 1727204253.58331: done getting the remaining hosts for this loop 22690 1727204253.58335: getting the next task for host managed-node2 22690 1727204253.58342: done getting next task for host managed-node2 22690 1727204253.58348: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22690 1727204253.58352: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204253.58368: getting variables 22690 1727204253.58370: in VariableManager get_vars() 22690 1727204253.58409: Calling all_inventory to load vars for managed-node2 22690 1727204253.58411: Calling groups_inventory to load vars for managed-node2 22690 1727204253.58413: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204253.58426: Calling all_plugins_play to load vars for managed-node2 22690 1727204253.58429: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204253.58432: Calling groups_plugins_play to load vars for managed-node2 22690 1727204253.59630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204253.60174: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000002d 22690 1727204253.60180: WORKER PROCESS EXITING 22690 1727204253.61221: done with get_vars() 22690 1727204253.61252: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.046) 0:00:20.896 ***** 22690 1727204253.61335: entering _queue_task() for managed-node2/ping 22690 1727204253.61336: Creating lock for ping 22690 1727204253.61625: worker is 1 (out of 1 available) 22690 1727204253.61638: exiting _queue_task() for managed-node2/ping 22690 1727204253.61651: done queuing things up, now waiting for results queue to drain 22690 1727204253.61653: waiting for pending results... 
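The 'Re-test connectivity' step being queued here is an ordinary ping module call run after the profile change; its 'pong' result further down confirms the managed host is still reachable over the multiplexed SSH connection ('mux_client_request_session: master session id: 2'). A stand-alone sketch of such a task, not the role's verbatim source at tasks/main.yml:192, would be:

    - name: Re-test connectivity
      ansible.builtin.ping:
      when: ansible_distribution_major_version != '6'

The chunks that follow trace the usual module life cycle: Ansible builds an AnsiballZ_ping.py payload, creates a remote temp directory, copies the payload over SFTP, executes it with /usr/bin/python3.12, and then removes the temp directory.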
22690 1727204253.61846: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 22690 1727204253.61919: in run() - task 127b8e07-fff9-78bb-bf56-00000000002e 22690 1727204253.61930: variable 'ansible_search_path' from source: unknown 22690 1727204253.61934: variable 'ansible_search_path' from source: unknown 22690 1727204253.61967: calling self._execute() 22690 1727204253.62061: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.62067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.62078: variable 'omit' from source: magic vars 22690 1727204253.62386: variable 'ansible_distribution_major_version' from source: facts 22690 1727204253.62399: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204253.62406: variable 'omit' from source: magic vars 22690 1727204253.62442: variable 'omit' from source: magic vars 22690 1727204253.62469: variable 'omit' from source: magic vars 22690 1727204253.62504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204253.62536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204253.62556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204253.62574: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204253.62586: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204253.62611: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204253.62614: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.62619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.62708: Set connection var ansible_connection to ssh 22690 1727204253.62720: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204253.62725: Set connection var ansible_pipelining to False 22690 1727204253.62728: Set connection var ansible_shell_type to sh 22690 1727204253.62734: Set connection var ansible_shell_executable to /bin/sh 22690 1727204253.62741: Set connection var ansible_timeout to 10 22690 1727204253.62762: variable 'ansible_shell_executable' from source: unknown 22690 1727204253.62765: variable 'ansible_connection' from source: unknown 22690 1727204253.62770: variable 'ansible_module_compression' from source: unknown 22690 1727204253.62772: variable 'ansible_shell_type' from source: unknown 22690 1727204253.62775: variable 'ansible_shell_executable' from source: unknown 22690 1727204253.62778: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204253.62780: variable 'ansible_pipelining' from source: unknown 22690 1727204253.62783: variable 'ansible_timeout' from source: unknown 22690 1727204253.62787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204253.62950: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204253.62959: variable 'omit' from source: magic vars 22690 
1727204253.62964: starting attempt loop 22690 1727204253.62966: running the handler 22690 1727204253.62982: _low_level_execute_command(): starting 22690 1727204253.62989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204253.63552: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204253.63558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204253.63561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.63616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204253.63620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204253.63622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204253.63702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204253.65472: stdout chunk (state=3): >>>/root <<< 22690 1727204253.65575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204253.65650: stderr chunk (state=3): >>><<< 22690 1727204253.65652: stdout chunk (state=3): >>><<< 22690 1727204253.65673: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204253.65687: _low_level_execute_command(): starting 22690 1727204253.65694: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633 
`" && echo ansible-tmp-1727204253.6567252-23928-191706241860633="` echo /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633 `" ) && sleep 0' 22690 1727204253.66168: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204253.66177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204253.66202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.66213: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204253.66216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.66257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204253.66276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204253.66354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204253.68332: stdout chunk (state=3): >>>ansible-tmp-1727204253.6567252-23928-191706241860633=/root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633 <<< 22690 1727204253.68491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204253.68507: stderr chunk (state=3): >>><<< 22690 1727204253.68510: stdout chunk (state=3): >>><<< 22690 1727204253.68530: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204253.6567252-23928-191706241860633=/root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204253.68576: variable 'ansible_module_compression' from 
source: unknown 22690 1727204253.68618: ANSIBALLZ: Using lock for ping 22690 1727204253.68621: ANSIBALLZ: Acquiring lock 22690 1727204253.68625: ANSIBALLZ: Lock acquired: 139846652077616 22690 1727204253.68629: ANSIBALLZ: Creating module 22690 1727204253.82657: ANSIBALLZ: Writing module into payload 22690 1727204253.82701: ANSIBALLZ: Writing module 22690 1727204253.82721: ANSIBALLZ: Renaming module 22690 1727204253.82728: ANSIBALLZ: Done creating module 22690 1727204253.82744: variable 'ansible_facts' from source: unknown 22690 1727204253.82788: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py 22690 1727204253.82895: Sending initial data 22690 1727204253.82899: Sent initial data (153 bytes) 22690 1727204253.83363: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204253.83402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204253.83407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.83410: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204253.83412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204253.83414: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.83475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204253.83480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204253.83482: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204253.83549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204253.85229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204253.85289: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204253.85357: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp7akx2bhe /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py <<< 22690 1727204253.85361: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py" <<< 22690 1727204253.85423: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp7akx2bhe" to remote "/root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py" <<< 22690 1727204253.86062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204253.86140: stderr chunk (state=3): >>><<< 22690 1727204253.86144: stdout chunk (state=3): >>><<< 22690 1727204253.86163: done transferring module to remote 22690 1727204253.86175: _low_level_execute_command(): starting 22690 1727204253.86180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/ /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py && sleep 0' 22690 1727204253.86635: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204253.86676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204253.86679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204253.86686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.86688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204253.86691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204253.86693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.86736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204253.86739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204253.86742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204253.86819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204253.88974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204253.88980: stdout chunk (state=3): >>><<< 22690 1727204253.88983: stderr chunk (state=3): >>><<< 22690 1727204253.88986: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204253.88989: _low_level_execute_command(): starting 22690 1727204253.88991: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/AnsiballZ_ping.py && sleep 0' 22690 1727204253.89441: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204253.89453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204253.89464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204253.89491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204253.89501: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204253.89508: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204253.89517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.89538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204253.89545: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204253.89552: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204253.89560: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204253.89570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204253.89584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204253.89591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204253.89597: stderr chunk (state=3): >>>debug2: match found <<< 22690 1727204253.89641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204253.89682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204253.89696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204253.89714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204253.89823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204254.06021: stdout 
chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22690 1727204254.07353: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204254.07422: stderr chunk (state=3): >>><<< 22690 1727204254.07426: stdout chunk (state=3): >>><<< 22690 1727204254.07437: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204254.07456: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204254.07466: _low_level_execute_command(): starting 22690 1727204254.07472: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204253.6567252-23928-191706241860633/ > /dev/null 2>&1 && sleep 0' 22690 1727204254.07942: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204254.07952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204254.07976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204254.07980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204254.08036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204254.08040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204254.08117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204254.10042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204254.10103: stderr chunk (state=3): >>><<< 22690 1727204254.10107: stdout chunk (state=3): >>><<< 22690 1727204254.10122: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204254.10128: handler run complete 22690 1727204254.10140: attempt loop complete, returning result 22690 1727204254.10147: _execute() done 22690 1727204254.10150: dumping result to json 22690 1727204254.10152: done dumping result, returning 22690 1727204254.10160: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-78bb-bf56-00000000002e] 22690 1727204254.10163: sending task result for task 127b8e07-fff9-78bb-bf56-00000000002e 22690 1727204254.10258: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000002e 22690 1727204254.10261: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 22690 1727204254.10322: no more pending results, returning what we have 22690 1727204254.10327: results queue empty 22690 1727204254.10327: checking for any_errors_fatal 22690 1727204254.10334: done checking for any_errors_fatal 22690 1727204254.10335: checking for max_fail_percentage 22690 1727204254.10337: done checking for max_fail_percentage 22690 1727204254.10337: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.10339: done checking to see if all hosts have failed 22690 1727204254.10339: getting the remaining hosts for this loop 22690 1727204254.10341: done getting the remaining hosts for this loop 22690 1727204254.10345: getting the next task for host managed-node2 22690 1727204254.10352: done 
getting next task for host managed-node2 22690 1727204254.10354: ^ task is: TASK: meta (role_complete) 22690 1727204254.10356: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204254.10368: getting variables 22690 1727204254.10370: in VariableManager get_vars() 22690 1727204254.10408: Calling all_inventory to load vars for managed-node2 22690 1727204254.10411: Calling groups_inventory to load vars for managed-node2 22690 1727204254.10413: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.10425: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.10428: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.10431: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.11903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.18345: done with get_vars() 22690 1727204254.18387: done getting variables 22690 1727204254.18473: done queuing things up, now waiting for results queue to drain 22690 1727204254.18475: results queue empty 22690 1727204254.18476: checking for any_errors_fatal 22690 1727204254.18479: done checking for any_errors_fatal 22690 1727204254.18480: checking for max_fail_percentage 22690 1727204254.18481: done checking for max_fail_percentage 22690 1727204254.18482: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.18482: done checking to see if all hosts have failed 22690 1727204254.18483: getting the remaining hosts for this loop 22690 1727204254.18484: done getting the remaining hosts for this loop 22690 1727204254.18487: getting the next task for host managed-node2 22690 1727204254.18491: done getting next task for host managed-node2 22690 1727204254.18493: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 22690 1727204254.18495: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.18498: getting variables 22690 1727204254.18499: in VariableManager get_vars() 22690 1727204254.18513: Calling all_inventory to load vars for managed-node2 22690 1727204254.18518: Calling groups_inventory to load vars for managed-node2 22690 1727204254.18520: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.18526: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.18529: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.18532: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.20070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.22274: done with get_vars() 22690 1727204254.22306: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.610) 0:00:21.507 ***** 22690 1727204254.22393: entering _queue_task() for managed-node2/include_tasks 22690 1727204254.22778: worker is 1 (out of 1 available) 22690 1727204254.22793: exiting _queue_task() for managed-node2/include_tasks 22690 1727204254.22808: done queuing things up, now waiting for results queue to drain 22690 1727204254.22810: waiting for pending results... 22690 1727204254.23200: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 22690 1727204254.23273: in run() - task 127b8e07-fff9-78bb-bf56-000000000030 22690 1727204254.23303: variable 'ansible_search_path' from source: unknown 22690 1727204254.23354: calling self._execute() 22690 1727204254.23471: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.23486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.23510: variable 'omit' from source: magic vars 22690 1727204254.24052: variable 'ansible_distribution_major_version' from source: facts 22690 1727204254.24057: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204254.24060: _execute() done 22690 1727204254.24063: dumping result to json 22690 1727204254.24067: done dumping result, returning 22690 1727204254.24071: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [127b8e07-fff9-78bb-bf56-000000000030] 22690 1727204254.24073: sending task result for task 127b8e07-fff9-78bb-bf56-000000000030 22690 1727204254.24159: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000030 22690 1727204254.24163: WORKER PROCESS EXITING 22690 1727204254.24198: no more pending results, returning what we have 22690 1727204254.24204: in VariableManager get_vars() 22690 1727204254.24259: Calling all_inventory to load vars for managed-node2 22690 1727204254.24262: Calling groups_inventory to load vars for managed-node2 22690 1727204254.24267: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.24285: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.24288: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.24292: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.26478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 
1727204254.28676: done with get_vars() 22690 1727204254.28709: variable 'ansible_search_path' from source: unknown 22690 1727204254.28730: we have included files to process 22690 1727204254.28731: generating all_blocks data 22690 1727204254.28734: done generating all_blocks data 22690 1727204254.28739: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 22690 1727204254.28741: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 22690 1727204254.28743: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 22690 1727204254.29103: done processing included file 22690 1727204254.29105: iterating over new_blocks loaded from include file 22690 1727204254.29106: in VariableManager get_vars() 22690 1727204254.29126: done with get_vars() 22690 1727204254.29128: filtering new block on tags 22690 1727204254.29145: done filtering new block on tags 22690 1727204254.29147: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed-node2 22690 1727204254.29153: extending task lists for all hosts with included blocks 22690 1727204254.29188: done extending task lists 22690 1727204254.29189: done processing included files 22690 1727204254.29190: results queue empty 22690 1727204254.29190: checking for any_errors_fatal 22690 1727204254.29193: done checking for any_errors_fatal 22690 1727204254.29194: checking for max_fail_percentage 22690 1727204254.29195: done checking for max_fail_percentage 22690 1727204254.29196: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.29197: done checking to see if all hosts have failed 22690 1727204254.29198: getting the remaining hosts for this loop 22690 1727204254.29200: done getting the remaining hosts for this loop 22690 1727204254.29202: getting the next task for host managed-node2 22690 1727204254.29206: done getting next task for host managed-node2 22690 1727204254.29208: ^ task is: TASK: Assert that warnings is empty 22690 1727204254.29211: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.29213: getting variables 22690 1727204254.29214: in VariableManager get_vars() 22690 1727204254.29228: Calling all_inventory to load vars for managed-node2 22690 1727204254.29230: Calling groups_inventory to load vars for managed-node2 22690 1727204254.29233: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.29238: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.29241: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.29244: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.30811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.33074: done with get_vars() 22690 1727204254.33106: done getting variables 22690 1727204254.33162: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.107) 0:00:21.615 ***** 22690 1727204254.33196: entering _queue_task() for managed-node2/assert 22690 1727204254.33590: worker is 1 (out of 1 available) 22690 1727204254.33605: exiting _queue_task() for managed-node2/assert 22690 1727204254.33620: done queuing things up, now waiting for results queue to drain 22690 1727204254.33622: waiting for pending results... 
22690 1727204254.34088: running TaskExecutor() for managed-node2/TASK: Assert that warnings is empty 22690 1727204254.34093: in run() - task 127b8e07-fff9-78bb-bf56-000000000304 22690 1727204254.34097: variable 'ansible_search_path' from source: unknown 22690 1727204254.34099: variable 'ansible_search_path' from source: unknown 22690 1727204254.34133: calling self._execute() 22690 1727204254.34241: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.34253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.34269: variable 'omit' from source: magic vars 22690 1727204254.34693: variable 'ansible_distribution_major_version' from source: facts 22690 1727204254.34713: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204254.34759: variable 'omit' from source: magic vars 22690 1727204254.34783: variable 'omit' from source: magic vars 22690 1727204254.34831: variable 'omit' from source: magic vars 22690 1727204254.34887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204254.34973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204254.34977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204254.34988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204254.35006: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204254.35046: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204254.35055: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.35064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.35369: Set connection var ansible_connection to ssh 22690 1727204254.35374: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204254.35377: Set connection var ansible_pipelining to False 22690 1727204254.35379: Set connection var ansible_shell_type to sh 22690 1727204254.35383: Set connection var ansible_shell_executable to /bin/sh 22690 1727204254.35386: Set connection var ansible_timeout to 10 22690 1727204254.35388: variable 'ansible_shell_executable' from source: unknown 22690 1727204254.35390: variable 'ansible_connection' from source: unknown 22690 1727204254.35393: variable 'ansible_module_compression' from source: unknown 22690 1727204254.35396: variable 'ansible_shell_type' from source: unknown 22690 1727204254.35398: variable 'ansible_shell_executable' from source: unknown 22690 1727204254.35400: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.35402: variable 'ansible_pipelining' from source: unknown 22690 1727204254.35404: variable 'ansible_timeout' from source: unknown 22690 1727204254.35406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.35485: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204254.35502: variable 'omit' from source: magic vars 22690 
1727204254.35513: starting attempt loop 22690 1727204254.35529: running the handler 22690 1727204254.35707: variable '__network_connections_result' from source: set_fact 22690 1727204254.35730: Evaluated conditional ('warnings' not in __network_connections_result): True 22690 1727204254.35746: handler run complete 22690 1727204254.35770: attempt loop complete, returning result 22690 1727204254.35779: _execute() done 22690 1727204254.35787: dumping result to json 22690 1727204254.35794: done dumping result, returning 22690 1727204254.35807: done running TaskExecutor() for managed-node2/TASK: Assert that warnings is empty [127b8e07-fff9-78bb-bf56-000000000304] 22690 1727204254.35820: sending task result for task 127b8e07-fff9-78bb-bf56-000000000304 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22690 1727204254.36011: no more pending results, returning what we have 22690 1727204254.36017: results queue empty 22690 1727204254.36019: checking for any_errors_fatal 22690 1727204254.36021: done checking for any_errors_fatal 22690 1727204254.36022: checking for max_fail_percentage 22690 1727204254.36024: done checking for max_fail_percentage 22690 1727204254.36025: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.36026: done checking to see if all hosts have failed 22690 1727204254.36027: getting the remaining hosts for this loop 22690 1727204254.36028: done getting the remaining hosts for this loop 22690 1727204254.36033: getting the next task for host managed-node2 22690 1727204254.36040: done getting next task for host managed-node2 22690 1727204254.36043: ^ task is: TASK: Assert that there is output in stderr 22690 1727204254.36047: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.36052: getting variables 22690 1727204254.36054: in VariableManager get_vars() 22690 1727204254.36100: Calling all_inventory to load vars for managed-node2 22690 1727204254.36103: Calling groups_inventory to load vars for managed-node2 22690 1727204254.36106: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.36123: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.36126: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.36130: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.36884: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000304 22690 1727204254.36888: WORKER PROCESS EXITING 22690 1727204254.38209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.40422: done with get_vars() 22690 1727204254.40459: done getting variables 22690 1727204254.40532: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.073) 0:00:21.689 ***** 22690 1727204254.40563: entering _queue_task() for managed-node2/assert 22690 1727204254.40954: worker is 1 (out of 1 available) 22690 1727204254.41171: exiting _queue_task() for managed-node2/assert 22690 1727204254.41184: done queuing things up, now waiting for results queue to drain 22690 1727204254.41185: waiting for pending results... 
22690 1727204254.41305: running TaskExecutor() for managed-node2/TASK: Assert that there is output in stderr 22690 1727204254.41436: in run() - task 127b8e07-fff9-78bb-bf56-000000000305 22690 1727204254.41463: variable 'ansible_search_path' from source: unknown 22690 1727204254.41474: variable 'ansible_search_path' from source: unknown 22690 1727204254.41523: calling self._execute() 22690 1727204254.41643: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.41658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.41680: variable 'omit' from source: magic vars 22690 1727204254.42110: variable 'ansible_distribution_major_version' from source: facts 22690 1727204254.42134: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204254.42146: variable 'omit' from source: magic vars 22690 1727204254.42201: variable 'omit' from source: magic vars 22690 1727204254.42250: variable 'omit' from source: magic vars 22690 1727204254.42305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204254.42355: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204254.42384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204254.42471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204254.42474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204254.42477: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204254.42480: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.42485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.42595: Set connection var ansible_connection to ssh 22690 1727204254.42613: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204254.42635: Set connection var ansible_pipelining to False 22690 1727204254.42642: Set connection var ansible_shell_type to sh 22690 1727204254.42652: Set connection var ansible_shell_executable to /bin/sh 22690 1727204254.42663: Set connection var ansible_timeout to 10 22690 1727204254.42691: variable 'ansible_shell_executable' from source: unknown 22690 1727204254.42698: variable 'ansible_connection' from source: unknown 22690 1727204254.42734: variable 'ansible_module_compression' from source: unknown 22690 1727204254.42737: variable 'ansible_shell_type' from source: unknown 22690 1727204254.42740: variable 'ansible_shell_executable' from source: unknown 22690 1727204254.42742: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.42744: variable 'ansible_pipelining' from source: unknown 22690 1727204254.42746: variable 'ansible_timeout' from source: unknown 22690 1727204254.42749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.42909: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204254.42952: variable 'omit' from source: magic vars 
22690 1727204254.42955: starting attempt loop 22690 1727204254.42958: running the handler 22690 1727204254.43102: variable '__network_connections_result' from source: set_fact 22690 1727204254.43124: Evaluated conditional ('stderr' in __network_connections_result): True 22690 1727204254.43170: handler run complete 22690 1727204254.43173: attempt loop complete, returning result 22690 1727204254.43178: _execute() done 22690 1727204254.43180: dumping result to json 22690 1727204254.43183: done dumping result, returning 22690 1727204254.43185: done running TaskExecutor() for managed-node2/TASK: Assert that there is output in stderr [127b8e07-fff9-78bb-bf56-000000000305] 22690 1727204254.43196: sending task result for task 127b8e07-fff9-78bb-bf56-000000000305 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22690 1727204254.43460: no more pending results, returning what we have 22690 1727204254.43464: results queue empty 22690 1727204254.43468: checking for any_errors_fatal 22690 1727204254.43476: done checking for any_errors_fatal 22690 1727204254.43477: checking for max_fail_percentage 22690 1727204254.43479: done checking for max_fail_percentage 22690 1727204254.43480: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.43481: done checking to see if all hosts have failed 22690 1727204254.43482: getting the remaining hosts for this loop 22690 1727204254.43483: done getting the remaining hosts for this loop 22690 1727204254.43488: getting the next task for host managed-node2 22690 1727204254.43499: done getting next task for host managed-node2 22690 1727204254.43502: ^ task is: TASK: meta (flush_handlers) 22690 1727204254.43505: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.43510: getting variables 22690 1727204254.43512: in VariableManager get_vars() 22690 1727204254.43556: Calling all_inventory to load vars for managed-node2 22690 1727204254.43560: Calling groups_inventory to load vars for managed-node2 22690 1727204254.43562: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.43571: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000305 22690 1727204254.43574: WORKER PROCESS EXITING 22690 1727204254.43777: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.43781: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.43785: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.45746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.47957: done with get_vars() 22690 1727204254.47996: done getting variables 22690 1727204254.48076: in VariableManager get_vars() 22690 1727204254.48091: Calling all_inventory to load vars for managed-node2 22690 1727204254.48093: Calling groups_inventory to load vars for managed-node2 22690 1727204254.48095: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.48100: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.48103: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.48106: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.49777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.52000: done with get_vars() 22690 1727204254.52047: done queuing things up, now waiting for results queue to drain 22690 1727204254.52050: results queue empty 22690 1727204254.52051: checking for any_errors_fatal 22690 1727204254.52054: done checking for any_errors_fatal 22690 1727204254.52055: checking for max_fail_percentage 22690 1727204254.52056: done checking for max_fail_percentage 22690 1727204254.52057: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.52058: done checking to see if all hosts have failed 22690 1727204254.52058: getting the remaining hosts for this loop 22690 1727204254.52068: done getting the remaining hosts for this loop 22690 1727204254.52072: getting the next task for host managed-node2 22690 1727204254.52076: done getting next task for host managed-node2 22690 1727204254.52078: ^ task is: TASK: meta (flush_handlers) 22690 1727204254.52080: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.52083: getting variables 22690 1727204254.52084: in VariableManager get_vars() 22690 1727204254.52098: Calling all_inventory to load vars for managed-node2 22690 1727204254.52100: Calling groups_inventory to load vars for managed-node2 22690 1727204254.52102: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.52108: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.52111: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.52114: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.53675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.55848: done with get_vars() 22690 1727204254.55884: done getting variables 22690 1727204254.55943: in VariableManager get_vars() 22690 1727204254.55957: Calling all_inventory to load vars for managed-node2 22690 1727204254.55960: Calling groups_inventory to load vars for managed-node2 22690 1727204254.55962: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.55969: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.55972: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.55975: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.57600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.59813: done with get_vars() 22690 1727204254.59863: done queuing things up, now waiting for results queue to drain 22690 1727204254.59867: results queue empty 22690 1727204254.59868: checking for any_errors_fatal 22690 1727204254.59870: done checking for any_errors_fatal 22690 1727204254.59871: checking for max_fail_percentage 22690 1727204254.59872: done checking for max_fail_percentage 22690 1727204254.59873: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.59874: done checking to see if all hosts have failed 22690 1727204254.59875: getting the remaining hosts for this loop 22690 1727204254.59876: done getting the remaining hosts for this loop 22690 1727204254.59879: getting the next task for host managed-node2 22690 1727204254.59883: done getting next task for host managed-node2 22690 1727204254.59884: ^ task is: None 22690 1727204254.59886: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.59887: done queuing things up, now waiting for results queue to drain 22690 1727204254.59888: results queue empty 22690 1727204254.59889: checking for any_errors_fatal 22690 1727204254.59890: done checking for any_errors_fatal 22690 1727204254.59891: checking for max_fail_percentage 22690 1727204254.59892: done checking for max_fail_percentage 22690 1727204254.59893: checking to see if all hosts have failed and the running result is not ok 22690 1727204254.59894: done checking to see if all hosts have failed 22690 1727204254.59895: getting the next task for host managed-node2 22690 1727204254.59898: done getting next task for host managed-node2 22690 1727204254.59899: ^ task is: None 22690 1727204254.59900: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204254.59956: in VariableManager get_vars() 22690 1727204254.59976: done with get_vars() 22690 1727204254.59982: in VariableManager get_vars() 22690 1727204254.59992: done with get_vars() 22690 1727204254.59996: variable 'omit' from source: magic vars 22690 1727204254.60032: in VariableManager get_vars() 22690 1727204254.60041: done with get_vars() 22690 1727204254.60064: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 22690 1727204254.60260: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204254.60288: getting the remaining hosts for this loop 22690 1727204254.60289: done getting the remaining hosts for this loop 22690 1727204254.60292: getting the next task for host managed-node2 22690 1727204254.60295: done getting next task for host managed-node2 22690 1727204254.60297: ^ task is: TASK: Gathering Facts 22690 1727204254.60299: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204254.60301: getting variables 22690 1727204254.60302: in VariableManager get_vars() 22690 1727204254.60310: Calling all_inventory to load vars for managed-node2 22690 1727204254.60313: Calling groups_inventory to load vars for managed-node2 22690 1727204254.60318: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204254.60325: Calling all_plugins_play to load vars for managed-node2 22690 1727204254.60327: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204254.60330: Calling groups_plugins_play to load vars for managed-node2 22690 1727204254.61979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204254.64182: done with get_vars() 22690 1727204254.64219: done getting variables 22690 1727204254.64274: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.237) 0:00:21.926 ***** 22690 1727204254.64301: entering _queue_task() for managed-node2/gather_facts 22690 1727204254.64672: worker is 1 (out of 1 available) 22690 1727204254.64688: exiting _queue_task() for managed-node2/gather_facts 22690 1727204254.64701: done queuing things up, now waiting for results queue to drain 22690 1727204254.64702: waiting for pending results... 
22690 1727204254.64966: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204254.65274: in run() - task 127b8e07-fff9-78bb-bf56-000000000316 22690 1727204254.65278: variable 'ansible_search_path' from source: unknown 22690 1727204254.65281: calling self._execute() 22690 1727204254.65283: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.65287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.65289: variable 'omit' from source: magic vars 22690 1727204254.65690: variable 'ansible_distribution_major_version' from source: facts 22690 1727204254.65708: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204254.65719: variable 'omit' from source: magic vars 22690 1727204254.65755: variable 'omit' from source: magic vars 22690 1727204254.65799: variable 'omit' from source: magic vars 22690 1727204254.65848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204254.65901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204254.65927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204254.65953: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204254.65977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204254.66014: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204254.66023: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.66031: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.66144: Set connection var ansible_connection to ssh 22690 1727204254.66160: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204254.66175: Set connection var ansible_pipelining to False 22690 1727204254.66187: Set connection var ansible_shell_type to sh 22690 1727204254.66197: Set connection var ansible_shell_executable to /bin/sh 22690 1727204254.66209: Set connection var ansible_timeout to 10 22690 1727204254.66237: variable 'ansible_shell_executable' from source: unknown 22690 1727204254.66270: variable 'ansible_connection' from source: unknown 22690 1727204254.66273: variable 'ansible_module_compression' from source: unknown 22690 1727204254.66276: variable 'ansible_shell_type' from source: unknown 22690 1727204254.66278: variable 'ansible_shell_executable' from source: unknown 22690 1727204254.66280: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204254.66282: variable 'ansible_pipelining' from source: unknown 22690 1727204254.66287: variable 'ansible_timeout' from source: unknown 22690 1727204254.66373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204254.66501: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204254.66521: variable 'omit' from source: magic vars 22690 1727204254.66531: starting attempt loop 22690 1727204254.66538: running the 
handler 22690 1727204254.66558: variable 'ansible_facts' from source: unknown 22690 1727204254.66602: _low_level_execute_command(): starting 22690 1727204254.66615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204254.67495: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204254.67514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204254.67537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204254.67642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204254.69452: stdout chunk (state=3): >>>/root <<< 22690 1727204254.69630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204254.69656: stdout chunk (state=3): >>><<< 22690 1727204254.69660: stderr chunk (state=3): >>><<< 22690 1727204254.69685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204254.69793: _low_level_execute_command(): starting 22690 1727204254.69797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351 `" && echo ansible-tmp-1727204254.6969292-24008-74917762370351="` echo /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351 `" ) && sleep 0' 22690 
1727204254.70373: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204254.70390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204254.70484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204254.70520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204254.70536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204254.70558: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204254.70660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204254.72645: stdout chunk (state=3): >>>ansible-tmp-1727204254.6969292-24008-74917762370351=/root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351 <<< 22690 1727204254.72783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204254.72880: stderr chunk (state=3): >>><<< 22690 1727204254.72890: stdout chunk (state=3): >>><<< 22690 1727204254.73072: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204254.6969292-24008-74917762370351=/root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204254.73076: variable 'ansible_module_compression' from source: unknown 22690 1727204254.73079: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204254.73099: variable 'ansible_facts' from 
source: unknown 22690 1727204254.73270: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py 22690 1727204254.73550: Sending initial data 22690 1727204254.73553: Sent initial data (153 bytes) 22690 1727204254.74160: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204254.74180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204254.74196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204254.74223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204254.74332: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204254.74357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204254.74469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204254.76087: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204254.76185: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204254.76284: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp4veg0ag7 /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py <<< 22690 1727204254.76287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py" <<< 22690 1727204254.76378: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp4veg0ag7" to remote "/root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py" <<< 22690 1727204254.78268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204254.78281: stdout chunk (state=3): >>><<< 22690 1727204254.78294: stderr chunk (state=3): >>><<< 22690 1727204254.78332: done transferring module to remote 22690 1727204254.78433: _low_level_execute_command(): starting 22690 1727204254.78438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/ /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py && sleep 0' 22690 1727204254.79046: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204254.79061: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204254.79109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204254.79127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204254.79222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204254.79270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204254.79343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204254.82012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204254.82019: stdout chunk (state=3): >>><<< 22690 1727204254.82022: stderr chunk (state=3): >>><<< 22690 1727204254.82026: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204254.82029: _low_level_execute_command(): starting 22690 1727204254.82032: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/AnsiballZ_setup.py && sleep 0' 22690 1727204254.82635: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204254.82651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204254.82668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204254.82689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204254.82720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204254.82813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204254.82832: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204254.82854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204254.82972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204255.48825: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "35", "epoch": "1727204255", "epoch_int": "1727204255", "date": "2024-09-24", "time": "14:57:35", "iso8601_micro": "2024-09-24T18:57:35.123085Z", "iso8601": "2024-09-24T18:57:35Z", "iso8601_basic": "20240924T145735123085", "iso8601_basic_short": "20240924T145735", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_fips": false, "ansible_interfaces": ["lsr27", "peerlsr27", "eth0", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": 
"ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3054, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 662, "free": 3054}, "nocache": {"free": 3486, "used": 230}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, 
"masters": {}}, "ansible_uptime_seconds": 601, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316633600, "block_size": 4096, "block_total": 64479564, "block_available": 61356600, "block_used": 3122964, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.513671875, "15m": 0.28125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204255.50777: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204255.50889: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204255.50944: stderr chunk (state=3): >>><<< 22690 1727204255.50954: stdout chunk (state=3): >>><<< 22690 1727204255.51023: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "35", "epoch": "1727204255", "epoch_int": "1727204255", "date": "2024-09-24", "time": "14:57:35", "iso8601_micro": "2024-09-24T18:57:35.123085Z", "iso8601": "2024-09-24T18:57:35Z", "iso8601_basic": "20240924T145735123085", "iso8601_basic_short": "20240924T145735", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_fips": false, "ansible_interfaces": ["lsr27", "peerlsr27", "eth0", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", 
"mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3054, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 662, "free": 3054}, "nocache": {"free": 3486, "used": 230}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], 
"labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 601, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316633600, "block_size": 4096, "block_total": 64479564, "block_available": 61356600, "block_used": 3122964, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.513671875, "15m": 0.28125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], 
"fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204255.51649: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204255.51882: _low_level_execute_command(): starting 22690 1727204255.51887: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204254.6969292-24008-74917762370351/ > /dev/null 2>&1 && sleep 0' 22690 1727204255.53028: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204255.53060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204255.53285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204255.53508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204255.53552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
22690 1727204255.53723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204255.53828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204255.55832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204255.55837: stderr chunk (state=3): >>><<< 22690 1727204255.55846: stdout chunk (state=3): >>><<< 22690 1727204255.55873: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204255.55886: handler run complete 22690 1727204255.56043: variable 'ansible_facts' from source: unknown 22690 1727204255.56160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.56531: variable 'ansible_facts' from source: unknown 22690 1727204255.56635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.56787: attempt loop complete, returning result 22690 1727204255.56797: _execute() done 22690 1727204255.56806: dumping result to json 22690 1727204255.56845: done dumping result, returning 22690 1727204255.56860: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-000000000316] 22690 1727204255.56874: sending task result for task 127b8e07-fff9-78bb-bf56-000000000316 ok: [managed-node2] 22690 1727204255.57825: no more pending results, returning what we have 22690 1727204255.57829: results queue empty 22690 1727204255.57830: checking for any_errors_fatal 22690 1727204255.57831: done checking for any_errors_fatal 22690 1727204255.57832: checking for max_fail_percentage 22690 1727204255.57834: done checking for max_fail_percentage 22690 1727204255.57834: checking to see if all hosts have failed and the running result is not ok 22690 1727204255.57836: done checking to see if all hosts have failed 22690 1727204255.57836: getting the remaining hosts for this loop 22690 1727204255.57838: done getting the remaining hosts for this loop 22690 1727204255.57841: getting the next task for host managed-node2 22690 1727204255.57846: done getting next task for host managed-node2 22690 1727204255.57848: ^ task is: TASK: meta (flush_handlers) 22690 1727204255.57850: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204255.57854: getting variables 22690 1727204255.57855: in VariableManager get_vars() 22690 1727204255.57881: Calling all_inventory to load vars for managed-node2 22690 1727204255.57884: Calling groups_inventory to load vars for managed-node2 22690 1727204255.57888: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.57900: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.57903: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.57907: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.58559: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000316 22690 1727204255.58563: WORKER PROCESS EXITING 22690 1727204255.59735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.62022: done with get_vars() 22690 1727204255.62049: done getting variables 22690 1727204255.62121: in VariableManager get_vars() 22690 1727204255.62132: Calling all_inventory to load vars for managed-node2 22690 1727204255.62135: Calling groups_inventory to load vars for managed-node2 22690 1727204255.62138: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.62143: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.62146: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.62149: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.63552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.65612: done with get_vars() 22690 1727204255.65657: done queuing things up, now waiting for results queue to drain 22690 1727204255.65660: results queue empty 22690 1727204255.65661: checking for any_errors_fatal 22690 1727204255.65667: done checking for any_errors_fatal 22690 1727204255.65668: checking for max_fail_percentage 22690 1727204255.65669: done checking for max_fail_percentage 22690 1727204255.65675: checking to see if all hosts have failed and the running result is not ok 22690 1727204255.65676: done checking to see if all hosts have failed 22690 1727204255.65677: getting the remaining hosts for this loop 22690 1727204255.65678: done getting the remaining hosts for this loop 22690 1727204255.65681: getting the next task for host managed-node2 22690 1727204255.65686: done getting next task for host managed-node2 22690 1727204255.65688: ^ task is: TASK: Show network_provider 22690 1727204255.65690: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204255.65693: getting variables 22690 1727204255.65694: in VariableManager get_vars() 22690 1727204255.65705: Calling all_inventory to load vars for managed-node2 22690 1727204255.65707: Calling groups_inventory to load vars for managed-node2 22690 1727204255.65710: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.65716: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.65718: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.65721: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.67252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.69380: done with get_vars() 22690 1727204255.69419: done getting variables 22690 1727204255.69486: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Tuesday 24 September 2024 14:57:35 -0400 (0:00:01.052) 0:00:22.978 ***** 22690 1727204255.69521: entering _queue_task() for managed-node2/debug 22690 1727204255.69929: worker is 1 (out of 1 available) 22690 1727204255.69944: exiting _queue_task() for managed-node2/debug 22690 1727204255.69960: done queuing things up, now waiting for results queue to drain 22690 1727204255.69961: waiting for pending results... 22690 1727204255.70309: running TaskExecutor() for managed-node2/TASK: Show network_provider 22690 1727204255.70444: in run() - task 127b8e07-fff9-78bb-bf56-000000000033 22690 1727204255.70477: variable 'ansible_search_path' from source: unknown 22690 1727204255.70530: calling self._execute() 22690 1727204255.70644: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204255.70659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204255.70682: variable 'omit' from source: magic vars 22690 1727204255.71157: variable 'ansible_distribution_major_version' from source: facts 22690 1727204255.71180: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204255.71192: variable 'omit' from source: magic vars 22690 1727204255.71232: variable 'omit' from source: magic vars 22690 1727204255.71289: variable 'omit' from source: magic vars 22690 1727204255.71340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204255.71391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204255.71419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204255.71444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204255.71463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204255.71510: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204255.71519: variable 'ansible_host' from source: host vars for 
'managed-node2' 22690 1727204255.71527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204255.71642: Set connection var ansible_connection to ssh 22690 1727204255.71662: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204255.71681: Set connection var ansible_pipelining to False 22690 1727204255.71772: Set connection var ansible_shell_type to sh 22690 1727204255.71775: Set connection var ansible_shell_executable to /bin/sh 22690 1727204255.71779: Set connection var ansible_timeout to 10 22690 1727204255.71782: variable 'ansible_shell_executable' from source: unknown 22690 1727204255.71784: variable 'ansible_connection' from source: unknown 22690 1727204255.71787: variable 'ansible_module_compression' from source: unknown 22690 1727204255.71789: variable 'ansible_shell_type' from source: unknown 22690 1727204255.71791: variable 'ansible_shell_executable' from source: unknown 22690 1727204255.71795: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204255.71798: variable 'ansible_pipelining' from source: unknown 22690 1727204255.71800: variable 'ansible_timeout' from source: unknown 22690 1727204255.71802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204255.71956: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204255.71976: variable 'omit' from source: magic vars 22690 1727204255.72026: starting attempt loop 22690 1727204255.72030: running the handler 22690 1727204255.72053: variable 'network_provider' from source: set_fact 22690 1727204255.72158: variable 'network_provider' from source: set_fact 22690 1727204255.72179: handler run complete 22690 1727204255.72203: attempt loop complete, returning result 22690 1727204255.72211: _execute() done 22690 1727204255.72244: dumping result to json 22690 1727204255.72247: done dumping result, returning 22690 1727204255.72251: done running TaskExecutor() for managed-node2/TASK: Show network_provider [127b8e07-fff9-78bb-bf56-000000000033] 22690 1727204255.72254: sending task result for task 127b8e07-fff9-78bb-bf56-000000000033 ok: [managed-node2] => { "network_provider": "nm" } 22690 1727204255.72419: no more pending results, returning what we have 22690 1727204255.72423: results queue empty 22690 1727204255.72424: checking for any_errors_fatal 22690 1727204255.72427: done checking for any_errors_fatal 22690 1727204255.72428: checking for max_fail_percentage 22690 1727204255.72430: done checking for max_fail_percentage 22690 1727204255.72431: checking to see if all hosts have failed and the running result is not ok 22690 1727204255.72432: done checking to see if all hosts have failed 22690 1727204255.72433: getting the remaining hosts for this loop 22690 1727204255.72434: done getting the remaining hosts for this loop 22690 1727204255.72439: getting the next task for host managed-node2 22690 1727204255.72449: done getting next task for host managed-node2 22690 1727204255.72451: ^ task is: TASK: meta (flush_handlers) 22690 1727204255.72454: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22690 1727204255.72460: getting variables 22690 1727204255.72462: in VariableManager get_vars() 22690 1727204255.72601: Calling all_inventory to load vars for managed-node2 22690 1727204255.72605: Calling groups_inventory to load vars for managed-node2 22690 1727204255.72610: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.72624: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.72630: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.72634: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.73385: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000033 22690 1727204255.73389: WORKER PROCESS EXITING 22690 1727204255.74636: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.76901: done with get_vars() 22690 1727204255.76932: done getting variables 22690 1727204255.77018: in VariableManager get_vars() 22690 1727204255.77029: Calling all_inventory to load vars for managed-node2 22690 1727204255.77032: Calling groups_inventory to load vars for managed-node2 22690 1727204255.77035: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.77040: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.77043: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.77046: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.78578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.80888: done with get_vars() 22690 1727204255.80933: done queuing things up, now waiting for results queue to drain 22690 1727204255.80936: results queue empty 22690 1727204255.80937: checking for any_errors_fatal 22690 1727204255.80940: done checking for any_errors_fatal 22690 1727204255.80940: checking for max_fail_percentage 22690 1727204255.80942: done checking for max_fail_percentage 22690 1727204255.80942: checking to see if all hosts have failed and the running result is not ok 22690 1727204255.80943: done checking to see if all hosts have failed 22690 1727204255.80944: getting the remaining hosts for this loop 22690 1727204255.80945: done getting the remaining hosts for this loop 22690 1727204255.80948: getting the next task for host managed-node2 22690 1727204255.80959: done getting next task for host managed-node2 22690 1727204255.80961: ^ task is: TASK: meta (flush_handlers) 22690 1727204255.80963: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204255.80967: getting variables 22690 1727204255.80969: in VariableManager get_vars() 22690 1727204255.80979: Calling all_inventory to load vars for managed-node2 22690 1727204255.80982: Calling groups_inventory to load vars for managed-node2 22690 1727204255.80985: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.80991: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.80994: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.80997: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.83346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.86546: done with get_vars() 22690 1727204255.86626: done getting variables 22690 1727204255.86690: in VariableManager get_vars() 22690 1727204255.86701: Calling all_inventory to load vars for managed-node2 22690 1727204255.86704: Calling groups_inventory to load vars for managed-node2 22690 1727204255.86707: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.86712: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.86718: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.86722: Calling groups_plugins_play to load vars for managed-node2 22690 1727204255.89214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204255.97510: done with get_vars() 22690 1727204255.97549: done queuing things up, now waiting for results queue to drain 22690 1727204255.97552: results queue empty 22690 1727204255.97553: checking for any_errors_fatal 22690 1727204255.97554: done checking for any_errors_fatal 22690 1727204255.97555: checking for max_fail_percentage 22690 1727204255.97556: done checking for max_fail_percentage 22690 1727204255.97557: checking to see if all hosts have failed and the running result is not ok 22690 1727204255.97558: done checking to see if all hosts have failed 22690 1727204255.97558: getting the remaining hosts for this loop 22690 1727204255.97562: done getting the remaining hosts for this loop 22690 1727204255.97566: getting the next task for host managed-node2 22690 1727204255.97574: done getting next task for host managed-node2 22690 1727204255.97575: ^ task is: None 22690 1727204255.97577: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204255.97578: done queuing things up, now waiting for results queue to drain 22690 1727204255.97579: results queue empty 22690 1727204255.97580: checking for any_errors_fatal 22690 1727204255.97581: done checking for any_errors_fatal 22690 1727204255.97582: checking for max_fail_percentage 22690 1727204255.97583: done checking for max_fail_percentage 22690 1727204255.97583: checking to see if all hosts have failed and the running result is not ok 22690 1727204255.97584: done checking to see if all hosts have failed 22690 1727204255.97586: getting the next task for host managed-node2 22690 1727204255.97588: done getting next task for host managed-node2 22690 1727204255.97589: ^ task is: None 22690 1727204255.97590: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204255.97623: in VariableManager get_vars() 22690 1727204255.97646: done with get_vars() 22690 1727204255.97654: in VariableManager get_vars() 22690 1727204255.97672: done with get_vars() 22690 1727204255.97677: variable 'omit' from source: magic vars 22690 1727204255.97801: variable 'profile' from source: play vars 22690 1727204255.97915: in VariableManager get_vars() 22690 1727204255.97931: done with get_vars() 22690 1727204255.97953: variable 'omit' from source: magic vars 22690 1727204255.98028: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 22690 1727204255.98902: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204255.98927: getting the remaining hosts for this loop 22690 1727204255.98929: done getting the remaining hosts for this loop 22690 1727204255.98931: getting the next task for host managed-node2 22690 1727204255.98934: done getting next task for host managed-node2 22690 1727204255.98936: ^ task is: TASK: Gathering Facts 22690 1727204255.98938: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204255.98940: getting variables 22690 1727204255.98941: in VariableManager get_vars() 22690 1727204255.98954: Calling all_inventory to load vars for managed-node2 22690 1727204255.98956: Calling groups_inventory to load vars for managed-node2 22690 1727204255.98959: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204255.98967: Calling all_plugins_play to load vars for managed-node2 22690 1727204255.98970: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204255.98973: Calling groups_plugins_play to load vars for managed-node2 22690 1727204256.01836: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204256.04650: done with get_vars() 22690 1727204256.04689: done getting variables 22690 1727204256.04740: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:57:36 -0400 (0:00:00.352) 0:00:23.331 ***** 22690 1727204256.04770: entering _queue_task() for managed-node2/gather_facts 22690 1727204256.05139: worker is 1 (out of 1 available) 22690 1727204256.05151: exiting _queue_task() for managed-node2/gather_facts 22690 1727204256.05167: done queuing things up, now waiting for results queue to drain 22690 1727204256.05169: waiting for pending results... 
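
[Editor's note] At this point the strategy has moved on to the "Set down {{ profile }}" play and queues another Gathering Facts task. The trace that follows repeats the connection prelude: a /bin/sh probe ('echo ~ && sleep 0') to resolve the remote home directory, then the umask/mkdir one-liner that creates a fresh per-task temp directory, all riding the existing ControlMaster socket. A minimal sketch of that probe pattern, where run_remote() is a hypothetical helper and the ControlPath value is the one shown in the trace:

#!/usr/bin/env python3
# Minimal sketch of the connection prelude the trace below repeats.
# run_remote() is a hypothetical helper; it reuses the ControlMaster socket
# that the log shows under /root/.ansible/cp/ so each command rides the
# already-open multiplexed SSH session.
import subprocess

HOST = "root@10.31.47.73"
CONTROL_PATH = "/root/.ansible/cp/7ef5e35320"  # path taken from the trace

def run_remote(command: str) -> str:
    """Run a /bin/sh one-liner on the managed node and return its stdout."""
    proc = subprocess.run(
        ["ssh", "-o", f"ControlPath={CONTROL_PATH}", HOST,
         f"/bin/sh -c '{command}'"],
        capture_output=True, text=True, check=True,
    )
    return proc.stdout.strip()

# The same home-directory probe the worker issues before transferring the module:
print("remote home:", run_remote("echo ~ && sleep 0"))
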
22690 1727204256.05409: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204256.05529: in run() - task 127b8e07-fff9-78bb-bf56-00000000032b 22690 1727204256.05554: variable 'ansible_search_path' from source: unknown 22690 1727204256.05606: calling self._execute() 22690 1727204256.05722: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204256.05743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204256.05851: variable 'omit' from source: magic vars 22690 1727204256.06197: variable 'ansible_distribution_major_version' from source: facts 22690 1727204256.06219: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204256.06232: variable 'omit' from source: magic vars 22690 1727204256.06267: variable 'omit' from source: magic vars 22690 1727204256.06318: variable 'omit' from source: magic vars 22690 1727204256.06378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204256.06430: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204256.06459: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204256.06489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204256.06573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204256.06576: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204256.06579: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204256.06582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204256.06691: Set connection var ansible_connection to ssh 22690 1727204256.06707: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204256.06728: Set connection var ansible_pipelining to False 22690 1727204256.06735: Set connection var ansible_shell_type to sh 22690 1727204256.06745: Set connection var ansible_shell_executable to /bin/sh 22690 1727204256.06758: Set connection var ansible_timeout to 10 22690 1727204256.06792: variable 'ansible_shell_executable' from source: unknown 22690 1727204256.06802: variable 'ansible_connection' from source: unknown 22690 1727204256.06809: variable 'ansible_module_compression' from source: unknown 22690 1727204256.06818: variable 'ansible_shell_type' from source: unknown 22690 1727204256.06900: variable 'ansible_shell_executable' from source: unknown 22690 1727204256.06903: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204256.06905: variable 'ansible_pipelining' from source: unknown 22690 1727204256.06907: variable 'ansible_timeout' from source: unknown 22690 1727204256.06909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204256.07083: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204256.07103: variable 'omit' from source: magic vars 22690 1727204256.07119: starting attempt loop 22690 1727204256.07127: running the 
handler 22690 1727204256.07148: variable 'ansible_facts' from source: unknown 22690 1727204256.07181: _low_level_execute_command(): starting 22690 1727204256.07193: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204256.08125: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204256.08213: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204256.08258: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204256.08345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204256.10124: stdout chunk (state=3): >>>/root <<< 22690 1727204256.10341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204256.10345: stdout chunk (state=3): >>><<< 22690 1727204256.10349: stderr chunk (state=3): >>><<< 22690 1727204256.10488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204256.10492: _low_level_execute_command(): starting 22690 1727204256.10495: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200 `" && echo ansible-tmp-1727204256.103789-24254-159062492182200="` echo /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200 `" ) && sleep 0' 22690 
1727204256.11126: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204256.11143: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204256.11164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204256.11191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204256.11282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204256.11326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204256.11356: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204256.11470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204256.13576: stdout chunk (state=3): >>>ansible-tmp-1727204256.103789-24254-159062492182200=/root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200 <<< 22690 1727204256.13798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204256.13802: stdout chunk (state=3): >>><<< 22690 1727204256.13805: stderr chunk (state=3): >>><<< 22690 1727204256.13826: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204256.103789-24254-159062492182200=/root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204256.13873: variable 'ansible_module_compression' from source: unknown 22690 1727204256.13971: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 
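
[Editor's note] The worker reuses a locally cached AnsiballZ build of the setup module (ansible.modules.setup-ZIP_DEFLATED) and, in the records that follow, pushes it over SFTP, marks it executable with chmod u+x, and runs it. The invocation shown earlier has '_ansible_keep_remote_files': False, so these generated AnsiballZ_*.py files are deleted after each task; the standard knob for keeping them on the managed node for inspection is the ANSIBLE_KEEP_REMOTE_FILES environment variable. A minimal sketch, assuming tests_ethernet.yml (the path in the earlier task banner) is the entry playbook of this run and omitting the inventory and other options, so it is illustrative only:

#!/usr/bin/env python3
# Minimal sketch: rerun the play while keeping the AnsiballZ payloads.
# ANSIBLE_KEEP_REMOTE_FILES=1 asks Ansible not to delete the generated
# AnsiballZ_*.py files from the remote ~/.ansible/tmp/... directories,
# so they can be inspected after a -vvv run like the one traced here.
import os
import subprocess

env = dict(os.environ, ANSIBLE_KEEP_REMOTE_FILES="1")

subprocess.run(
    [
        "ansible-playbook", "-vvv",
        # Playbook path copied from the task banner in the trace above;
        # inventory and extra vars would still need to be supplied.
        "/tmp/collections-MVC/ansible_collections/fedora/"
        "linux_system_roles/tests/network/playbooks/tests_ethernet.yml",
    ],
    env=env,
    check=True,
)
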
22690 1727204256.14040: variable 'ansible_facts' from source: unknown 22690 1727204256.14228: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py 22690 1727204256.14502: Sending initial data 22690 1727204256.14506: Sent initial data (153 bytes) 22690 1727204256.15274: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204256.15308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204256.15418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204256.17233: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204256.17453: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204256.17550: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwynx0gbk /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py <<< 22690 1727204256.17554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py" <<< 22690 1727204256.17799: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpwynx0gbk" to remote "/root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py" <<< 22690 1727204256.20712: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204256.21055: stderr chunk (state=3): >>><<< 22690 1727204256.21059: stdout chunk (state=3): >>><<< 22690 1727204256.21062: done transferring module to remote 22690 1727204256.21064: _low_level_execute_command(): starting 22690 1727204256.21069: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/ /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py && sleep 0' 22690 1727204256.22503: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204256.22607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204256.22721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204256.22859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204256.23005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204256.24867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204256.24956: stderr chunk (state=3): >>><<< 22690 1727204256.24977: stdout chunk (state=3): >>><<< 22690 1727204256.25021: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204256.25571: _low_level_execute_command(): starting 22690 1727204256.25575: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/AnsiballZ_setup.py && sleep 0' 22690 1727204256.26550: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204256.26556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204256.26559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204256.26561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204256.26563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204256.26813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204256.26828: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204256.27020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204256.94381: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.513671875, "15m": 0.28125}, "ansible_hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "36", "epoch": "1727204256", "epoch_int": "1727204256", "date": "2024-09-24", "time": "14:57:36", "iso8601_micro": "2024-09-24T18:57:36.581243Z", "iso8601": "2024-09-24T18:57:36Z", "iso8601_basic": "20240924T145736581243", "iso8601_basic_short": "20240924T145736", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo", "lsr27", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3048, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 668, "free": 3048}, "nocache": {"free": 3480, "used": 236}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_fa<<< 22690 1727204256.94425: stdout chunk (state=3): >>>ctor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 603, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316625408, "block_size": 4096, "block_total": 64479564, "block_available": 61356598, "block_used": 3122966, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204256.96682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204256.96686: stdout chunk (state=3): >>><<< 22690 1727204256.96688: stderr chunk (state=3): >>><<< 22690 1727204256.96692: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_is_chroot": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", 
"ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.54150390625, "5m": 0.513671875, "15m": 0.28125}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "36", "epoch": "1727204256", "epoch_int": "1727204256", "date": "2024-09-24", "time": "14:57:36", "iso8601_micro": "2024-09-24T18:57:36.581243Z", "iso8601": "2024-09-24T18:57:36Z", "iso8601_basic": "20240924T145736581243", "iso8601_basic_short": "20240924T145736", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo", "lsr27", "peerlsr27"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", 
"fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3048, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 668, "free": 3048}, "nocache": {"free": 3480, "used": 236}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 603, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316625408, "block_size": 4096, "block_total": 64479564, "block_available": 61356598, "block_used": 3122966, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, 
"filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204256.97046: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204256.97056: _low_level_execute_command(): starting 22690 1727204256.97061: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204256.103789-24254-159062492182200/ > /dev/null 2>&1 && sleep 0' 22690 1727204256.97557: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204256.97561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204256.97564: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204256.97568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204256.97626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 
1727204256.97629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204256.97635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204256.97728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204256.99740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204256.99774: stderr chunk (state=3): >>><<< 22690 1727204256.99785: stdout chunk (state=3): >>><<< 22690 1727204256.99789: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204256.99813: handler run complete 22690 1727204256.99984: variable 'ansible_facts' from source: unknown 22690 1727204257.00119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.00473: variable 'ansible_facts' from source: unknown 22690 1727204257.00537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.00623: attempt loop complete, returning result 22690 1727204257.00626: _execute() done 22690 1727204257.00630: dumping result to json 22690 1727204257.00674: done dumping result, returning 22690 1727204257.00679: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-00000000032b] 22690 1727204257.00681: sending task result for task 127b8e07-fff9-78bb-bf56-00000000032b ok: [managed-node2] 22690 1727204257.01590: no more pending results, returning what we have 22690 1727204257.01594: results queue empty 22690 1727204257.01595: checking for any_errors_fatal 22690 1727204257.01597: done checking for any_errors_fatal 22690 1727204257.01598: checking for max_fail_percentage 22690 1727204257.01600: done checking for max_fail_percentage 22690 1727204257.01601: checking to see if all hosts have failed and the running result is not ok 22690 1727204257.01602: done checking to see if all hosts have failed 22690 1727204257.01602: getting the remaining hosts for this loop 22690 1727204257.01604: done getting the remaining hosts for this loop 22690 1727204257.01610: getting the next task for host managed-node2 22690 1727204257.01615: done getting next task for host managed-node2 22690 1727204257.01617: ^ task is: TASK: meta (flush_handlers) 22690 1727204257.01619: ^ state is: HOST STATE: 
block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204257.01623: getting variables 22690 1727204257.01624: in VariableManager get_vars() 22690 1727204257.01653: Calling all_inventory to load vars for managed-node2 22690 1727204257.01656: Calling groups_inventory to load vars for managed-node2 22690 1727204257.01658: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.01693: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000032b 22690 1727204257.01697: WORKER PROCESS EXITING 22690 1727204257.01708: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.01711: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.01715: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.04228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.05930: done with get_vars() 22690 1727204257.05968: done getting variables 22690 1727204257.06034: in VariableManager get_vars() 22690 1727204257.06045: Calling all_inventory to load vars for managed-node2 22690 1727204257.06047: Calling groups_inventory to load vars for managed-node2 22690 1727204257.06051: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.06057: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.06060: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.06063: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.08147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.10348: done with get_vars() 22690 1727204257.10394: done queuing things up, now waiting for results queue to drain 22690 1727204257.10397: results queue empty 22690 1727204257.10399: checking for any_errors_fatal 22690 1727204257.10404: done checking for any_errors_fatal 22690 1727204257.10405: checking for max_fail_percentage 22690 1727204257.10406: done checking for max_fail_percentage 22690 1727204257.10412: checking to see if all hosts have failed and the running result is not ok 22690 1727204257.10413: done checking to see if all hosts have failed 22690 1727204257.10414: getting the remaining hosts for this loop 22690 1727204257.10415: done getting the remaining hosts for this loop 22690 1727204257.10421: getting the next task for host managed-node2 22690 1727204257.10425: done getting next task for host managed-node2 22690 1727204257.10429: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22690 1727204257.10431: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204257.10442: getting variables 22690 1727204257.10443: in VariableManager get_vars() 22690 1727204257.10461: Calling all_inventory to load vars for managed-node2 22690 1727204257.10464: Calling groups_inventory to load vars for managed-node2 22690 1727204257.10468: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.10474: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.10477: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.10480: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.12203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.14303: done with get_vars() 22690 1727204257.14329: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:37 -0400 (0:00:01.096) 0:00:24.427 ***** 22690 1727204257.14396: entering _queue_task() for managed-node2/include_tasks 22690 1727204257.14683: worker is 1 (out of 1 available) 22690 1727204257.14699: exiting _queue_task() for managed-node2/include_tasks 22690 1727204257.14714: done queuing things up, now waiting for results queue to drain 22690 1727204257.14716: waiting for pending results... 22690 1727204257.14918: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22690 1727204257.15006: in run() - task 127b8e07-fff9-78bb-bf56-00000000003c 22690 1727204257.15024: variable 'ansible_search_path' from source: unknown 22690 1727204257.15028: variable 'ansible_search_path' from source: unknown 22690 1727204257.15064: calling self._execute() 22690 1727204257.15146: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.15155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.15171: variable 'omit' from source: magic vars 22690 1727204257.15473: variable 'ansible_distribution_major_version' from source: facts 22690 1727204257.15484: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204257.15489: _execute() done 22690 1727204257.15498: dumping result to json 22690 1727204257.15501: done dumping result, returning 22690 1727204257.15505: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-78bb-bf56-00000000003c] 22690 1727204257.15512: sending task result for task 127b8e07-fff9-78bb-bf56-00000000003c 22690 1727204257.15620: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000003c 22690 1727204257.15623: WORKER PROCESS EXITING 22690 1727204257.15668: no more pending results, returning what we have 22690 1727204257.15679: in VariableManager get_vars() 22690 1727204257.15730: Calling all_inventory to load vars for managed-node2 22690 1727204257.15735: Calling groups_inventory to load vars for managed-node2 22690 1727204257.15737: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.15751: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.15754: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.15757: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.17388: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.18552: done with get_vars() 22690 1727204257.18576: variable 'ansible_search_path' from source: unknown 22690 1727204257.18577: variable 'ansible_search_path' from source: unknown 22690 1727204257.18606: we have included files to process 22690 1727204257.18607: generating all_blocks data 22690 1727204257.18608: done generating all_blocks data 22690 1727204257.18609: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204257.18610: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204257.18611: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204257.19033: done processing included file 22690 1727204257.19035: iterating over new_blocks loaded from include file 22690 1727204257.19036: in VariableManager get_vars() 22690 1727204257.19053: done with get_vars() 22690 1727204257.19054: filtering new block on tags 22690 1727204257.19067: done filtering new block on tags 22690 1727204257.19069: in VariableManager get_vars() 22690 1727204257.19082: done with get_vars() 22690 1727204257.19083: filtering new block on tags 22690 1727204257.19096: done filtering new block on tags 22690 1727204257.19097: in VariableManager get_vars() 22690 1727204257.19109: done with get_vars() 22690 1727204257.19111: filtering new block on tags 22690 1727204257.19121: done filtering new block on tags 22690 1727204257.19123: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 22690 1727204257.19127: extending task lists for all hosts with included blocks 22690 1727204257.19430: done extending task lists 22690 1727204257.19431: done processing included files 22690 1727204257.19432: results queue empty 22690 1727204257.19433: checking for any_errors_fatal 22690 1727204257.19434: done checking for any_errors_fatal 22690 1727204257.19435: checking for max_fail_percentage 22690 1727204257.19436: done checking for max_fail_percentage 22690 1727204257.19437: checking to see if all hosts have failed and the running result is not ok 22690 1727204257.19438: done checking to see if all hosts have failed 22690 1727204257.19439: getting the remaining hosts for this loop 22690 1727204257.19440: done getting the remaining hosts for this loop 22690 1727204257.19442: getting the next task for host managed-node2 22690 1727204257.19446: done getting next task for host managed-node2 22690 1727204257.19448: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22690 1727204257.19451: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204257.19459: getting variables 22690 1727204257.19460: in VariableManager get_vars() 22690 1727204257.19477: Calling all_inventory to load vars for managed-node2 22690 1727204257.19479: Calling groups_inventory to load vars for managed-node2 22690 1727204257.19481: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.19487: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.19489: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.19492: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.21065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.23207: done with get_vars() 22690 1727204257.23248: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.089) 0:00:24.516 ***** 22690 1727204257.23338: entering _queue_task() for managed-node2/setup 22690 1727204257.23823: worker is 1 (out of 1 available) 22690 1727204257.23838: exiting _queue_task() for managed-node2/setup 22690 1727204257.23852: done queuing things up, now waiting for results queue to drain 22690 1727204257.23854: waiting for pending results... 22690 1727204257.24560: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22690 1727204257.24910: in run() - task 127b8e07-fff9-78bb-bf56-00000000036c 22690 1727204257.24933: variable 'ansible_search_path' from source: unknown 22690 1727204257.24937: variable 'ansible_search_path' from source: unknown 22690 1727204257.24988: calling self._execute() 22690 1727204257.25227: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.25235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.25247: variable 'omit' from source: magic vars 22690 1727204257.26053: variable 'ansible_distribution_major_version' from source: facts 22690 1727204257.26179: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204257.26731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204257.30942: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204257.31030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204257.31080: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204257.31122: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204257.31155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204257.31255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204257.31301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22690 1727204257.31334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204257.31387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204257.31413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204257.31478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204257.31513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204257.31544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204257.31594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204257.31618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204257.31800: variable '__network_required_facts' from source: role '' defaults 22690 1727204257.31831: variable 'ansible_facts' from source: unknown 22690 1727204257.33141: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22690 1727204257.33155: when evaluation is False, skipping this task 22690 1727204257.33162: _execute() done 22690 1727204257.33172: dumping result to json 22690 1727204257.33180: done dumping result, returning 22690 1727204257.33244: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-78bb-bf56-00000000036c] 22690 1727204257.33247: sending task result for task 127b8e07-fff9-78bb-bf56-00000000036c 22690 1727204257.33324: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000036c 22690 1727204257.33327: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204257.33400: no more pending results, returning what we have 22690 1727204257.33405: results queue empty 22690 1727204257.33406: checking for any_errors_fatal 22690 1727204257.33408: done checking for any_errors_fatal 22690 1727204257.33409: checking for max_fail_percentage 22690 1727204257.33411: done checking for max_fail_percentage 22690 1727204257.33411: checking to see if all hosts have failed and the running result is not ok 22690 1727204257.33412: done checking to see if all hosts have failed 22690 1727204257.33413: getting the remaining hosts for this loop 22690 1727204257.33415: done getting the remaining hosts for 
this loop 22690 1727204257.33420: getting the next task for host managed-node2 22690 1727204257.33431: done getting next task for host managed-node2 22690 1727204257.33435: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22690 1727204257.33439: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204257.33456: getting variables 22690 1727204257.33457: in VariableManager get_vars() 22690 1727204257.33516: Calling all_inventory to load vars for managed-node2 22690 1727204257.33519: Calling groups_inventory to load vars for managed-node2 22690 1727204257.33521: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.33535: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.33538: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.33541: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.35798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.38479: done with get_vars() 22690 1727204257.38509: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.152) 0:00:24.669 ***** 22690 1727204257.38626: entering _queue_task() for managed-node2/stat 22690 1727204257.39188: worker is 1 (out of 1 available) 22690 1727204257.39204: exiting _queue_task() for managed-node2/stat 22690 1727204257.39222: done queuing things up, now waiting for results queue to drain 22690 1727204257.39223: waiting for pending results... 
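
The skip recorded just above for "Ensure ansible_facts used by role are present" comes from the guard `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`: the extra setup call only runs when some fact key the role needs is missing from what has already been gathered. Because the full fact payload collected earlier in this play is still cached for managed-node2, the difference is empty, the conditional evaluates False, and the task is skipped. A small self-contained sketch of that "gather only what is missing" pattern follows; the list of required facts and the play name are illustrative assumptions, not the contents of the role's __network_required_facts default.

# Illustrative sketch of the missing-facts guard; required_facts below is an assumption,
# not the role's actual default list.
- name: Gather facts only when something the play needs is absent
  hosts: managed-node2
  gather_facts: true
  vars:
    required_facts:
      - distribution
      - distribution_major_version
      - os_family
  tasks:
    - name: Ensure facts used by the play are present
      ansible.builtin.setup:
        gather_subset: min
      when: required_facts | difference(ansible_facts.keys() | list) | length > 0
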
22690 1727204257.39579: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 22690 1727204257.39636: in run() - task 127b8e07-fff9-78bb-bf56-00000000036e 22690 1727204257.39660: variable 'ansible_search_path' from source: unknown 22690 1727204257.39670: variable 'ansible_search_path' from source: unknown 22690 1727204257.39726: calling self._execute() 22690 1727204257.39973: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.39977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.39979: variable 'omit' from source: magic vars 22690 1727204257.40291: variable 'ansible_distribution_major_version' from source: facts 22690 1727204257.40316: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204257.40511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204257.40826: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204257.40891: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204257.40934: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204257.40987: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204257.41096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204257.41123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204257.41148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204257.41170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204257.41344: variable '__network_is_ostree' from source: set_fact 22690 1727204257.41348: Evaluated conditional (not __network_is_ostree is defined): False 22690 1727204257.41350: when evaluation is False, skipping this task 22690 1727204257.41352: _execute() done 22690 1727204257.41354: dumping result to json 22690 1727204257.41357: done dumping result, returning 22690 1727204257.41359: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-78bb-bf56-00000000036e] 22690 1727204257.41361: sending task result for task 127b8e07-fff9-78bb-bf56-00000000036e 22690 1727204257.41645: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000036e 22690 1727204257.41648: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22690 1727204257.41710: no more pending results, returning what we have 22690 1727204257.41714: results queue empty 22690 1727204257.41717: checking for any_errors_fatal 22690 1727204257.41725: done checking for any_errors_fatal 22690 1727204257.41726: checking for 
max_fail_percentage 22690 1727204257.41728: done checking for max_fail_percentage 22690 1727204257.41729: checking to see if all hosts have failed and the running result is not ok 22690 1727204257.41730: done checking to see if all hosts have failed 22690 1727204257.41731: getting the remaining hosts for this loop 22690 1727204257.41732: done getting the remaining hosts for this loop 22690 1727204257.41737: getting the next task for host managed-node2 22690 1727204257.41742: done getting next task for host managed-node2 22690 1727204257.41747: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22690 1727204257.41750: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204257.41772: getting variables 22690 1727204257.41774: in VariableManager get_vars() 22690 1727204257.41822: Calling all_inventory to load vars for managed-node2 22690 1727204257.41825: Calling groups_inventory to load vars for managed-node2 22690 1727204257.41828: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.41840: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.41844: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.41847: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.45031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.47536: done with get_vars() 22690 1727204257.47577: done getting variables 22690 1727204257.47657: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.090) 0:00:24.760 ***** 22690 1727204257.47697: entering _queue_task() for managed-node2/set_fact 22690 1727204257.48305: worker is 1 (out of 1 available) 22690 1727204257.48318: exiting _queue_task() for managed-node2/set_fact 22690 1727204257.48332: done queuing things up, now waiting for results queue to drain 22690 1727204257.48334: waiting for pending results... 
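[editor's note] Both ostree-related tasks in this block are gated by the conditional "not __network_is_ostree is defined"; because the fact was already set earlier in the run, the conditional evaluates to False and the executor short-circuits before any module runs, emitting the "skipping:" JSON shown above. A minimal sketch in plain Python (illustrative helpers, not Ansible internals) of building and filtering result records shaped like that output:

    # Hedged sketch: result dicts shaped like the "skipping:" JSON in this log.
    # Field names mirror the log output; the helper functions are illustrative only.
    def skipped_result(false_condition: str) -> dict:
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }

    def was_skipped(result: dict) -> bool:
        return "skip_reason" in result

    results = [
        skipped_result("not __network_is_ostree is defined"),
        {"changed": True, "ansible_facts": {"__network_is_ostree": False}},
    ]
    print([r for r in results if was_skipped(r)])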
22690 1727204257.48579: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22690 1727204257.48655: in run() - task 127b8e07-fff9-78bb-bf56-00000000036f 22690 1727204257.48729: variable 'ansible_search_path' from source: unknown 22690 1727204257.48738: variable 'ansible_search_path' from source: unknown 22690 1727204257.48802: calling self._execute() 22690 1727204257.49004: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.49008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.49011: variable 'omit' from source: magic vars 22690 1727204257.49423: variable 'ansible_distribution_major_version' from source: facts 22690 1727204257.49453: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204257.49660: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204257.49961: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204257.50023: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204257.50069: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204257.50124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204257.50271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204257.50274: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204257.50295: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204257.50339: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204257.50467: variable '__network_is_ostree' from source: set_fact 22690 1727204257.50481: Evaluated conditional (not __network_is_ostree is defined): False 22690 1727204257.50488: when evaluation is False, skipping this task 22690 1727204257.50525: _execute() done 22690 1727204257.50528: dumping result to json 22690 1727204257.50535: done dumping result, returning 22690 1727204257.50537: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-78bb-bf56-00000000036f] 22690 1727204257.50540: sending task result for task 127b8e07-fff9-78bb-bf56-00000000036f 22690 1727204257.50812: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000036f 22690 1727204257.50817: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22690 1727204257.50898: no more pending results, returning what we have 22690 1727204257.50903: results queue empty 22690 1727204257.50903: checking for any_errors_fatal 22690 1727204257.50912: done checking for any_errors_fatal 22690 
1727204257.50913: checking for max_fail_percentage 22690 1727204257.50914: done checking for max_fail_percentage 22690 1727204257.50917: checking to see if all hosts have failed and the running result is not ok 22690 1727204257.50918: done checking to see if all hosts have failed 22690 1727204257.50919: getting the remaining hosts for this loop 22690 1727204257.50921: done getting the remaining hosts for this loop 22690 1727204257.50925: getting the next task for host managed-node2 22690 1727204257.50935: done getting next task for host managed-node2 22690 1727204257.50939: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22690 1727204257.50942: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204257.50957: getting variables 22690 1727204257.50959: in VariableManager get_vars() 22690 1727204257.51005: Calling all_inventory to load vars for managed-node2 22690 1727204257.51008: Calling groups_inventory to load vars for managed-node2 22690 1727204257.51010: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204257.51024: Calling all_plugins_play to load vars for managed-node2 22690 1727204257.51027: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204257.51032: Calling groups_plugins_play to load vars for managed-node2 22690 1727204257.53296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204257.58232: done with get_vars() 22690 1727204257.58276: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.107) 0:00:24.868 ***** 22690 1727204257.58488: entering _queue_task() for managed-node2/service_facts 22690 1727204257.59036: worker is 1 (out of 1 available) 22690 1727204257.59052: exiting _queue_task() for managed-node2/service_facts 22690 1727204257.59070: done queuing things up, now waiting for results queue to drain 22690 1727204257.59072: waiting for pending results... 
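[editor's note] The service_facts task queued here is what produces the large ansible_facts["services"] mapping visible in the module stdout further down in this trace (one entry per unit, each with "name", "state", "status", and "source" keys). A small, hedged Python sketch of consuming that structure once it has been returned, for example to list running units or confirm that NetworkManager.service is enabled; the sample data below is a two-entry excerpt of the shape shown in the log, not the full fact set.

    # Hedged sketch: consuming an ansible_facts["services"] mapping of the shape
    # printed by the service_facts module output below (excerpted sample data).
    services = {
        "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                                   "status": "enabled", "source": "systemd"},
        "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                              "status": "disabled", "source": "systemd"},
    }

    running = sorted(name for name, svc in services.items() if svc["state"] == "running")
    print("running units:", running)
    print("NetworkManager enabled:",
          services.get("NetworkManager.service", {}).get("status") == "enabled")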
22690 1727204257.59469: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 22690 1727204257.59474: in run() - task 127b8e07-fff9-78bb-bf56-000000000371 22690 1727204257.59477: variable 'ansible_search_path' from source: unknown 22690 1727204257.59481: variable 'ansible_search_path' from source: unknown 22690 1727204257.59523: calling self._execute() 22690 1727204257.59629: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.59636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.59648: variable 'omit' from source: magic vars 22690 1727204257.60075: variable 'ansible_distribution_major_version' from source: facts 22690 1727204257.60088: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204257.60099: variable 'omit' from source: magic vars 22690 1727204257.60166: variable 'omit' from source: magic vars 22690 1727204257.60222: variable 'omit' from source: magic vars 22690 1727204257.60249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204257.60321: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204257.60327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204257.60333: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204257.60427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204257.60431: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204257.60433: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.60436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.60508: Set connection var ansible_connection to ssh 22690 1727204257.60521: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204257.60552: Set connection var ansible_pipelining to False 22690 1727204257.60555: Set connection var ansible_shell_type to sh 22690 1727204257.60558: Set connection var ansible_shell_executable to /bin/sh 22690 1727204257.60561: Set connection var ansible_timeout to 10 22690 1727204257.60574: variable 'ansible_shell_executable' from source: unknown 22690 1727204257.60577: variable 'ansible_connection' from source: unknown 22690 1727204257.60581: variable 'ansible_module_compression' from source: unknown 22690 1727204257.60583: variable 'ansible_shell_type' from source: unknown 22690 1727204257.60586: variable 'ansible_shell_executable' from source: unknown 22690 1727204257.60588: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204257.60644: variable 'ansible_pipelining' from source: unknown 22690 1727204257.60647: variable 'ansible_timeout' from source: unknown 22690 1727204257.60650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204257.60821: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204257.60830: variable 'omit' from source: magic vars 22690 
1727204257.60835: starting attempt loop 22690 1727204257.60838: running the handler 22690 1727204257.60882: _low_level_execute_command(): starting 22690 1727204257.60886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204257.61648: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204257.61668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204257.61671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204257.61675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204257.61678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204257.61681: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204257.61929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204257.61933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204257.61936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204257.61939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204257.62039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204257.63764: stdout chunk (state=3): >>>/root <<< 22690 1727204257.63887: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204257.63993: stderr chunk (state=3): >>><<< 22690 1727204257.63997: stdout chunk (state=3): >>><<< 22690 1727204257.64036: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204257.64040: _low_level_execute_command(): 
starting 22690 1727204257.64043: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859 `" && echo ansible-tmp-1727204257.6402035-24361-82978818659859="` echo /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859 `" ) && sleep 0' 22690 1727204257.64788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204257.64810: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204257.64917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204257.66874: stdout chunk (state=3): >>>ansible-tmp-1727204257.6402035-24361-82978818659859=/root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859 <<< 22690 1727204257.67048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204257.67169: stderr chunk (state=3): >>><<< 22690 1727204257.67173: stdout chunk (state=3): >>><<< 22690 1727204257.67177: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204257.6402035-24361-82978818659859=/root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204257.67179: variable 'ansible_module_compression' from source: unknown 22690 1727204257.67182: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 22690 1727204257.67219: variable 'ansible_facts' from source: unknown 22690 1727204257.67282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py 22690 1727204257.67396: Sending initial data 22690 1727204257.67399: Sent initial data (161 bytes) 22690 1727204257.67911: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204257.67918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204257.67920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204257.67923: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204257.67926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204257.67979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204257.67982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204257.67984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204257.68068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204257.69646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204257.69707: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204257.69774: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp3nxqgi2i /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py <<< 22690 1727204257.69780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py" <<< 22690 1727204257.69843: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp3nxqgi2i" to remote "/root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py" <<< 22690 1727204257.70693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204257.70799: stderr chunk (state=3): >>><<< 22690 1727204257.70803: stdout chunk (state=3): >>><<< 22690 1727204257.70822: done transferring module to remote 22690 1727204257.70858: _low_level_execute_command(): starting 22690 1727204257.70863: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/ /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py && sleep 0' 22690 1727204257.71486: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204257.71491: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204257.71547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204257.71551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204257.71627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204257.73438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204257.73579: stderr chunk (state=3): >>><<< 22690 1727204257.73583: stdout chunk (state=3): >>><<< 22690 1727204257.73608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204257.73612: _low_level_execute_command(): starting 22690 1727204257.73619: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/AnsiballZ_service_facts.py && sleep 0' 22690 1727204257.74412: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204257.74432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204257.74458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204257.74547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204257.74645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204259.90570: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"<<< 22690 1727204259.90640: stdout chunk (state=3): >>>name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, 
"modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymout<<< 22690 1727204259.90678: stdout chunk (state=3): >>>h-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": 
"indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22690 1727204259.92445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204259.92477: stdout chunk (state=3): >>><<< 22690 1727204259.92497: stderr chunk (state=3): >>><<< 22690 1727204259.92576: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": 
"plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204259.93643: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204259.93648: _low_level_execute_command(): starting 22690 1727204259.93651: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204257.6402035-24361-82978818659859/ > /dev/null 2>&1 && sleep 0' 22690 1727204259.94932: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204259.94942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204259.94952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204259.94968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204259.94987: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204259.94994: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204259.95003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204259.95017: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204259.95028: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204259.95034: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204259.95042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204259.95080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204259.95183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204259.95305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204259.95378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204259.97469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204259.97473: stdout chunk (state=3): >>><<< 22690 1727204259.97476: stderr chunk (state=3): >>><<< 22690 1727204259.97493: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204259.97500: handler run complete 22690 1727204259.98245: variable 'ansible_facts' from source: unknown 22690 1727204259.99173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204260.00249: variable 'ansible_facts' from source: unknown 22690 1727204260.00443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204260.00757: attempt loop complete, returning result 22690 1727204260.00762: _execute() done 22690 1727204260.00764: dumping result to json 22690 1727204260.00852: done dumping result, returning 22690 1727204260.00862: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-78bb-bf56-000000000371] 22690 1727204260.00868: sending task result for task 127b8e07-fff9-78bb-bf56-000000000371 22690 1727204260.02130: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000371 22690 1727204260.02140: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204260.02283: no more pending results, returning what we have 22690 1727204260.02287: results queue empty 22690 1727204260.02288: checking for any_errors_fatal 22690 1727204260.02293: done checking for any_errors_fatal 22690 1727204260.02294: checking for max_fail_percentage 22690 1727204260.02296: done checking for max_fail_percentage 22690 1727204260.02296: checking to see if all hosts have failed and the running result is not ok 22690 1727204260.02297: done checking to see if all hosts have failed 22690 1727204260.02298: getting the remaining hosts for this loop 22690 1727204260.02299: done getting the remaining hosts for this loop 22690 1727204260.02303: getting the next task for host managed-node2 22690 1727204260.02309: done getting next task for host managed-node2 22690 1727204260.02312: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22690 1727204260.02318: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 22690 1727204260.02328: getting variables 22690 1727204260.02330: in VariableManager get_vars() 22690 1727204260.02364: Calling all_inventory to load vars for managed-node2 22690 1727204260.02475: Calling groups_inventory to load vars for managed-node2 22690 1727204260.02483: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204260.02494: Calling all_plugins_play to load vars for managed-node2 22690 1727204260.02497: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204260.02500: Calling groups_plugins_play to load vars for managed-node2 22690 1727204260.05058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204260.07559: done with get_vars() 22690 1727204260.07605: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:40 -0400 (0:00:02.492) 0:00:27.360 ***** 22690 1727204260.07726: entering _queue_task() for managed-node2/package_facts 22690 1727204260.08399: worker is 1 (out of 1 available) 22690 1727204260.08412: exiting _queue_task() for managed-node2/package_facts 22690 1727204260.08428: done queuing things up, now waiting for results queue to drain 22690 1727204260.08430: waiting for pending results... 22690 1727204260.08562: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 22690 1727204260.08705: in run() - task 127b8e07-fff9-78bb-bf56-000000000372 22690 1727204260.08766: variable 'ansible_search_path' from source: unknown 22690 1727204260.08772: variable 'ansible_search_path' from source: unknown 22690 1727204260.08786: calling self._execute() 22690 1727204260.08895: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204260.08908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204260.08924: variable 'omit' from source: magic vars 22690 1727204260.09423: variable 'ansible_distribution_major_version' from source: facts 22690 1727204260.09427: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204260.09431: variable 'omit' from source: magic vars 22690 1727204260.09452: variable 'omit' from source: magic vars 22690 1727204260.09497: variable 'omit' from source: magic vars 22690 1727204260.09556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204260.09605: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204260.09644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204260.09672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204260.09691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204260.09730: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204260.09746: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204260.09756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204260.09878: Set connection var 
ansible_connection to ssh 22690 1727204260.09965: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204260.09970: Set connection var ansible_pipelining to False 22690 1727204260.09973: Set connection var ansible_shell_type to sh 22690 1727204260.09975: Set connection var ansible_shell_executable to /bin/sh 22690 1727204260.09978: Set connection var ansible_timeout to 10 22690 1727204260.09980: variable 'ansible_shell_executable' from source: unknown 22690 1727204260.09983: variable 'ansible_connection' from source: unknown 22690 1727204260.09986: variable 'ansible_module_compression' from source: unknown 22690 1727204260.09988: variable 'ansible_shell_type' from source: unknown 22690 1727204260.09990: variable 'ansible_shell_executable' from source: unknown 22690 1727204260.10000: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204260.10009: variable 'ansible_pipelining' from source: unknown 22690 1727204260.10020: variable 'ansible_timeout' from source: unknown 22690 1727204260.10029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204260.10272: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204260.10298: variable 'omit' from source: magic vars 22690 1727204260.10308: starting attempt loop 22690 1727204260.10318: running the handler 22690 1727204260.10404: _low_level_execute_command(): starting 22690 1727204260.10407: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204260.11762: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204260.11851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204260.11877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204260.11925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204260.12175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204260.13898: stdout chunk (state=3): >>>/root <<< 22690 1727204260.14095: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204260.14100: stderr chunk (state=3): >>><<< 22690 1727204260.14102: stdout chunk (state=3): >>><<< 22690 1727204260.14278: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204260.14283: _low_level_execute_command(): starting 22690 1727204260.14286: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553 `" && echo ansible-tmp-1727204260.1412313-24458-62132881922553="` echo /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553 `" ) && sleep 0' 22690 1727204260.14860: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204260.14962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204260.15282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204260.15306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204260.17403: stdout chunk (state=3): >>>ansible-tmp-1727204260.1412313-24458-62132881922553=/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553 <<< 22690 1727204260.17672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204260.17679: stdout chunk (state=3): >>><<< 22690 1727204260.17682: stderr chunk (state=3): >>><<< 22690 1727204260.17684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204260.1412313-24458-62132881922553=/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204260.17687: variable 'ansible_module_compression' from source: unknown 22690 1727204260.17724: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 22690 1727204260.17802: variable 'ansible_facts' from source: unknown 22690 1727204260.17936: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py 22690 1727204260.18243: Sending initial data 22690 1727204260.18249: Sent initial data (161 bytes) 22690 1727204260.18752: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204260.18771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204260.18775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204260.18795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204260.18907: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204260.18911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204260.18914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204260.19246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204260.20895: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204260.20961: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204260.21054: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpdyqkvzko /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py <<< 22690 1727204260.21058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py" <<< 22690 1727204260.21123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpdyqkvzko" to remote "/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py" <<< 22690 1727204260.22994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204260.23053: stderr chunk (state=3): >>><<< 22690 1727204260.23064: stdout chunk (state=3): >>><<< 22690 1727204260.23107: done transferring module to remote 22690 1727204260.23126: _low_level_execute_command(): starting 22690 1727204260.23192: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/ /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py && sleep 0' 22690 1727204260.23864: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204260.23886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204260.23903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204260.23969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204260.24034: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204260.24052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204260.24094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204260.24201: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 22690 1727204260.26223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204260.26232: stdout chunk (state=3): >>><<< 22690 1727204260.26235: stderr chunk (state=3): >>><<< 22690 1727204260.26350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204260.26354: _low_level_execute_command(): starting 22690 1727204260.26357: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/AnsiballZ_package_facts.py && sleep 0' 22690 1727204260.26926: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204260.26930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204260.26948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204260.26952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204260.26972: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204260.26975: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204260.26984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204260.26998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204260.27006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204260.27017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204260.27031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204260.27376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204260.27483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
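The records above document the rest of the module round-trip for this task: the cached AnsiballZ_package_facts.py payload is uploaded with sftp ("sftp> put ..."), made executable with chmod u+x, and then run with the remote /usr/bin/python3.12; the stdout that follows below is a single JSON document whose ansible_facts.packages maps each package name to a list of {name, version, release, epoch, arch, source} entries. A minimal end-to-end sketch of that sequence under the stated assumptions follows; it is not Ansible's own code, and all helper names are hypothetical.

# Sketch of the upload / chmod / execute / parse steps recorded above, not
# Ansible's implementation. Assumptions: OpenSSH "sftp" and "ssh" clients on
# the controller and key-based authentication to the managed node.
import json
import shlex
import subprocess


def sftp_put(host, local_path, remote_path):
    # "sftp -b -" reads batch commands from stdin, matching the
    # "sftp> put <local> <remote>" line in the log.
    subprocess.run(
        ["sftp", "-b", "-", host],
        input=f"put {local_path} {remote_path}\n",
        text=True,
        check=True,
    )


def ssh_run(host, remote_cmd):
    # Every low-level step in the log is ultimately "/bin/sh -c '<cmd>'".
    return subprocess.run(
        ["ssh", host, "/bin/sh -c " + shlex.quote(remote_cmd)],
        capture_output=True,
        text=True,
        check=True,
    )


def run_module(host, interpreter, tmpdir, local_wrapper,
               name="AnsiballZ_package_facts.py"):
    remote_wrapper = f"{tmpdir}/{name}"
    sftp_put(host, local_wrapper, remote_wrapper)
    # chmod u+x the temp directory and the wrapper (as in the log), then run
    # the wrapper with the remote interpreter and parse the JSON it prints.
    ssh_run(host, f"chmod u+x {tmpdir} {remote_wrapper} && sleep 0")
    result = ssh_run(host, f"{interpreter} {remote_wrapper} && sleep 0")
    return json.loads(result.stdout)


if __name__ == "__main__":
    # Host, interpreter, and paths are copied from the log purely to
    # illustrate the ansible_facts.packages structure that follows.
    facts = run_module(
        "10.31.47.73",
        "/usr/bin/python3.12",
        "/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553",
        "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpdyqkvzko",
    )
    glibc = facts["ansible_facts"]["packages"]["glibc"][0]
    print(glibc["version"], glibc["release"])  # 2.39 22.fc40 in the output below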
<<< 22690 1727204260.27597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204260.90447: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"na<<< 22690 1727204260.90642: stdout chunk (state=3): >>>me": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": 
"2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": 
[{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", 
"version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": 
[{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", 
"version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": 
[{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_<<< 22690 1727204260.90693: stdout chunk (state=3): >>>64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", 
"release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 22690 1727204260.90702: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": 
"python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22690 1727204260.92639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204260.92644: stdout chunk (state=3): >>><<< 22690 1727204260.92646: stderr chunk (state=3): >>><<< 22690 1727204260.92696: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204260.96348: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204260.96353: _low_level_execute_command(): starting 22690 1727204260.96356: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204260.1412313-24458-62132881922553/ > /dev/null 2>&1 && sleep 0' 22690 1727204260.96987: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204260.97005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204260.97109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204260.97138: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204260.97250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204260.99286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204260.99296: stdout chunk (state=3): >>><<< 22690 1727204260.99308: stderr chunk (state=3): >>><<< 22690 1727204260.99329: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204260.99341: handler run complete 22690 1727204261.00673: variable 'ansible_facts' from source: unknown 22690 1727204261.01090: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.03729: variable 'ansible_facts' from source: unknown 22690 1727204261.04302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.05405: attempt loop complete, returning result 22690 1727204261.05439: _execute() done 22690 1727204261.05449: dumping result to json 22690 1727204261.05747: done dumping result, returning 22690 1727204261.05762: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-78bb-bf56-000000000372] 22690 1727204261.05774: sending task result for task 127b8e07-fff9-78bb-bf56-000000000372 22690 1727204261.08973: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000372 22690 1727204261.08977: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204261.09201: no more pending results, returning what we have 22690 1727204261.09205: results queue empty 22690 1727204261.09206: checking for any_errors_fatal 22690 1727204261.09211: done checking for any_errors_fatal 22690 1727204261.09212: checking for max_fail_percentage 22690 1727204261.09213: done checking for max_fail_percentage 22690 1727204261.09214: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.09218: done checking to see if all hosts have failed 22690 1727204261.09219: getting the remaining hosts for this loop 22690 1727204261.09220: done getting the remaining hosts for this loop 22690 1727204261.09224: getting the next task for host managed-node2 22690 1727204261.09232: done getting next task for host managed-node2 22690 1727204261.09235: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22690 1727204261.09238: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204261.09249: getting variables 22690 1727204261.09250: in VariableManager get_vars() 22690 1727204261.09291: Calling all_inventory to load vars for managed-node2 22690 1727204261.09295: Calling groups_inventory to load vars for managed-node2 22690 1727204261.09297: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.09307: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.09310: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.09313: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.11042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.13320: done with get_vars() 22690 1727204261.13362: done getting variables 22690 1727204261.13445: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:41 -0400 (0:00:01.057) 0:00:28.418 ***** 22690 1727204261.13482: entering _queue_task() for managed-node2/debug 22690 1727204261.14071: worker is 1 (out of 1 available) 22690 1727204261.14085: exiting _queue_task() for managed-node2/debug 22690 1727204261.14112: done queuing things up, now waiting for results queue to drain 22690 1727204261.14114: waiting for pending results... 22690 1727204261.14486: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 22690 1727204261.14507: in run() - task 127b8e07-fff9-78bb-bf56-00000000003d 22690 1727204261.14533: variable 'ansible_search_path' from source: unknown 22690 1727204261.14541: variable 'ansible_search_path' from source: unknown 22690 1727204261.14591: calling self._execute() 22690 1727204261.14694: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.14705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.14721: variable 'omit' from source: magic vars 22690 1727204261.15232: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.15244: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.15251: variable 'omit' from source: magic vars 22690 1727204261.15306: variable 'omit' from source: magic vars 22690 1727204261.15430: variable 'network_provider' from source: set_fact 22690 1727204261.15450: variable 'omit' from source: magic vars 22690 1727204261.15507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204261.15671: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204261.15675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204261.15678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204261.15681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 
1727204261.15683: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204261.15686: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.15688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.15779: Set connection var ansible_connection to ssh 22690 1727204261.15790: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204261.15799: Set connection var ansible_pipelining to False 22690 1727204261.15802: Set connection var ansible_shell_type to sh 22690 1727204261.15814: Set connection var ansible_shell_executable to /bin/sh 22690 1727204261.15826: Set connection var ansible_timeout to 10 22690 1727204261.15851: variable 'ansible_shell_executable' from source: unknown 22690 1727204261.15854: variable 'ansible_connection' from source: unknown 22690 1727204261.15857: variable 'ansible_module_compression' from source: unknown 22690 1727204261.15860: variable 'ansible_shell_type' from source: unknown 22690 1727204261.15862: variable 'ansible_shell_executable' from source: unknown 22690 1727204261.15867: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.15870: variable 'ansible_pipelining' from source: unknown 22690 1727204261.15872: variable 'ansible_timeout' from source: unknown 22690 1727204261.15971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.16057: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204261.16071: variable 'omit' from source: magic vars 22690 1727204261.16077: starting attempt loop 22690 1727204261.16080: running the handler 22690 1727204261.16136: handler run complete 22690 1727204261.16159: attempt loop complete, returning result 22690 1727204261.16162: _execute() done 22690 1727204261.16167: dumping result to json 22690 1727204261.16170: done dumping result, returning 22690 1727204261.16251: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-78bb-bf56-00000000003d] 22690 1727204261.16254: sending task result for task 127b8e07-fff9-78bb-bf56-00000000003d 22690 1727204261.16324: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000003d 22690 1727204261.16327: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 22690 1727204261.16419: no more pending results, returning what we have 22690 1727204261.16423: results queue empty 22690 1727204261.16424: checking for any_errors_fatal 22690 1727204261.16432: done checking for any_errors_fatal 22690 1727204261.16433: checking for max_fail_percentage 22690 1727204261.16434: done checking for max_fail_percentage 22690 1727204261.16435: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.16436: done checking to see if all hosts have failed 22690 1727204261.16437: getting the remaining hosts for this loop 22690 1727204261.16438: done getting the remaining hosts for this loop 22690 1727204261.16442: getting the next task for host managed-node2 22690 1727204261.16448: done getting next task for host managed-node2 22690 1727204261.16451: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 22690 1727204261.16453: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204261.16462: getting variables 22690 1727204261.16463: in VariableManager get_vars() 22690 1727204261.16501: Calling all_inventory to load vars for managed-node2 22690 1727204261.16504: Calling groups_inventory to load vars for managed-node2 22690 1727204261.16506: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.16516: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.16518: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.16521: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.18458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.20822: done with get_vars() 22690 1727204261.20864: done getting variables 22690 1727204261.20931: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.074) 0:00:28.493 ***** 22690 1727204261.20968: entering _queue_task() for managed-node2/fail 22690 1727204261.21345: worker is 1 (out of 1 available) 22690 1727204261.21360: exiting _queue_task() for managed-node2/fail 22690 1727204261.21376: done queuing things up, now waiting for results queue to drain 22690 1727204261.21377: waiting for pending results... 
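The trace above shows the package_facts task completing with its result censored because 'no_log: true' was set, followed by the role's debug task reporting "Using network provider: nm". As a minimal sketch only (the real tasks live inside fedora.linux_system_roles.network and are not reproduced in this log; the play header, the network_provider default, and the NetworkManager lookup below are assumptions), a playbook producing a similar trace could look like this:

---
- name: Illustrative sketch of the package_facts / debug pattern seen in the trace
  hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto          # the log shows module_args manager=["auto"], strategy="first"
      no_log: true             # yields the "output has been hidden" censoring seen above

    - name: Report the installed NetworkManager version from the gathered facts
      ansible.builtin.debug:
        msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }}"
      when: "'NetworkManager' in ansible_facts.packages"

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider | default('nm') }}"  # default('nm') is an assumption; the log shows network_provider coming from an earlier set_fact

package_facts publishes ansible_facts.packages as a dictionary keyed by package name, each value a list of {name, version, release, epoch, arch, source} entries, which is exactly the structure of the large JSON result earlier in the trace.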
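The repeated "auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'" lines in the SSH stderr above show the ssh connection plugin reusing an OpenSSH ControlMaster socket rather than opening a new connection for every command; Ansible's ssh plugin requests this multiplexing by default. If it had to be set explicitly per host, one option is ssh arguments in the inventory; the snippet below is an assumption for illustration, not configuration read back from this run:

all:
  hosts:
    managed-node2:
      ansible_host: 10.31.47.73
      ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"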
22690 1727204261.21804: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22690 1727204261.21859: in run() - task 127b8e07-fff9-78bb-bf56-00000000003e 22690 1727204261.21897: variable 'ansible_search_path' from source: unknown 22690 1727204261.21907: variable 'ansible_search_path' from source: unknown 22690 1727204261.21954: calling self._execute() 22690 1727204261.22079: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.22093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.22117: variable 'omit' from source: magic vars 22690 1727204261.22663: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.22669: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.22749: variable 'network_state' from source: role '' defaults 22690 1727204261.22777: Evaluated conditional (network_state != {}): False 22690 1727204261.22785: when evaluation is False, skipping this task 22690 1727204261.22794: _execute() done 22690 1727204261.22803: dumping result to json 22690 1727204261.22811: done dumping result, returning 22690 1727204261.22823: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-78bb-bf56-00000000003e] 22690 1727204261.22834: sending task result for task 127b8e07-fff9-78bb-bf56-00000000003e 22690 1727204261.23172: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000003e 22690 1727204261.23176: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204261.23224: no more pending results, returning what we have 22690 1727204261.23229: results queue empty 22690 1727204261.23229: checking for any_errors_fatal 22690 1727204261.23236: done checking for any_errors_fatal 22690 1727204261.23237: checking for max_fail_percentage 22690 1727204261.23239: done checking for max_fail_percentage 22690 1727204261.23240: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.23241: done checking to see if all hosts have failed 22690 1727204261.23241: getting the remaining hosts for this loop 22690 1727204261.23243: done getting the remaining hosts for this loop 22690 1727204261.23247: getting the next task for host managed-node2 22690 1727204261.23254: done getting next task for host managed-node2 22690 1727204261.23259: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22690 1727204261.23261: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204261.23279: getting variables 22690 1727204261.23281: in VariableManager get_vars() 22690 1727204261.23324: Calling all_inventory to load vars for managed-node2 22690 1727204261.23328: Calling groups_inventory to load vars for managed-node2 22690 1727204261.23330: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.23344: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.23348: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.23352: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.25208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.27658: done with get_vars() 22690 1727204261.27690: done getting variables 22690 1727204261.27764: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.068) 0:00:28.561 ***** 22690 1727204261.27804: entering _queue_task() for managed-node2/fail 22690 1727204261.28205: worker is 1 (out of 1 available) 22690 1727204261.28222: exiting _queue_task() for managed-node2/fail 22690 1727204261.28237: done queuing things up, now waiting for results queue to drain 22690 1727204261.28238: waiting for pending results... 
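The task queued above (tasks/main.yml:18) applies the same network_state guard to managed hosts older than EL8. The records that follow again show network_state != {} evaluating False from the role defaults, so the version check never comes into play. A hedged sketch of the pattern, where the explicit version comparison is an assumption:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: The network_state variable requires a managed host running EL8 or later.
      when:
        - network_state != {}                             # false condition reported in the records below
        - ansible_distribution_major_version | int < 8    # assumed version guard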
22690 1727204261.28535: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22690 1727204261.28681: in run() - task 127b8e07-fff9-78bb-bf56-00000000003f 22690 1727204261.28712: variable 'ansible_search_path' from source: unknown 22690 1727204261.28802: variable 'ansible_search_path' from source: unknown 22690 1727204261.28806: calling self._execute() 22690 1727204261.28883: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.28898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.28921: variable 'omit' from source: magic vars 22690 1727204261.29328: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.29352: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.29493: variable 'network_state' from source: role '' defaults 22690 1727204261.29511: Evaluated conditional (network_state != {}): False 22690 1727204261.29519: when evaluation is False, skipping this task 22690 1727204261.29527: _execute() done 22690 1727204261.29536: dumping result to json 22690 1727204261.29544: done dumping result, returning 22690 1727204261.29562: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-78bb-bf56-00000000003f] 22690 1727204261.29672: sending task result for task 127b8e07-fff9-78bb-bf56-00000000003f 22690 1727204261.29761: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000003f 22690 1727204261.29765: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204261.29918: no more pending results, returning what we have 22690 1727204261.29921: results queue empty 22690 1727204261.29922: checking for any_errors_fatal 22690 1727204261.29928: done checking for any_errors_fatal 22690 1727204261.29929: checking for max_fail_percentage 22690 1727204261.29930: done checking for max_fail_percentage 22690 1727204261.29931: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.29932: done checking to see if all hosts have failed 22690 1727204261.29932: getting the remaining hosts for this loop 22690 1727204261.29934: done getting the remaining hosts for this loop 22690 1727204261.29937: getting the next task for host managed-node2 22690 1727204261.29943: done getting next task for host managed-node2 22690 1727204261.29946: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22690 1727204261.29948: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204261.29962: getting variables 22690 1727204261.29963: in VariableManager get_vars() 22690 1727204261.30072: Calling all_inventory to load vars for managed-node2 22690 1727204261.30075: Calling groups_inventory to load vars for managed-node2 22690 1727204261.30077: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.30087: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.30090: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.30093: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.37552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.39679: done with get_vars() 22690 1727204261.39720: done getting variables 22690 1727204261.39782: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.120) 0:00:28.681 ***** 22690 1727204261.39813: entering _queue_task() for managed-node2/fail 22690 1727204261.40213: worker is 1 (out of 1 available) 22690 1727204261.40231: exiting _queue_task() for managed-node2/fail 22690 1727204261.40245: done queuing things up, now waiting for results queue to drain 22690 1727204261.40247: waiting for pending results... 
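The guard queued above (tasks/main.yml:25) rejects teaming on EL10 and later. In the records that follow, the version test ansible_distribution_major_version | int > 9 evaluates True, but ansible_distribution in __network_rh_distros evaluates False for this managed node, so the task is skipped. A sketch built from those two observed conditions; any additional check that a team interface is actually requested is an assumption:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Network teaming is not supported on EL10 or later.
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        - __network_team_connections_defined              # assumed extra condition, not shown in the log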
22690 1727204261.40589: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22690 1727204261.40641: in run() - task 127b8e07-fff9-78bb-bf56-000000000040 22690 1727204261.40668: variable 'ansible_search_path' from source: unknown 22690 1727204261.40677: variable 'ansible_search_path' from source: unknown 22690 1727204261.40729: calling self._execute() 22690 1727204261.40839: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.40856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.40875: variable 'omit' from source: magic vars 22690 1727204261.41304: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.41324: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.41574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204261.44171: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204261.44417: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204261.44467: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204261.44509: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204261.44572: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204261.44636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.44676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.44706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.44748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.44771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.44971: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.44975: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22690 1727204261.45028: variable 'ansible_distribution' from source: facts 22690 1727204261.45039: variable '__network_rh_distros' from source: role '' defaults 22690 1727204261.45053: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22690 1727204261.45060: when evaluation is False, skipping this task 22690 1727204261.45071: _execute() done 22690 1727204261.45079: dumping result to json 22690 1727204261.45087: done dumping result, returning 22690 1727204261.45099: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-78bb-bf56-000000000040] 22690 1727204261.45108: sending task result for task 127b8e07-fff9-78bb-bf56-000000000040 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22690 1727204261.45284: no more pending results, returning what we have 22690 1727204261.45288: results queue empty 22690 1727204261.45289: checking for any_errors_fatal 22690 1727204261.45299: done checking for any_errors_fatal 22690 1727204261.45299: checking for max_fail_percentage 22690 1727204261.45301: done checking for max_fail_percentage 22690 1727204261.45302: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.45303: done checking to see if all hosts have failed 22690 1727204261.45304: getting the remaining hosts for this loop 22690 1727204261.45305: done getting the remaining hosts for this loop 22690 1727204261.45310: getting the next task for host managed-node2 22690 1727204261.45318: done getting next task for host managed-node2 22690 1727204261.45322: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22690 1727204261.45324: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204261.45341: getting variables 22690 1727204261.45343: in VariableManager get_vars() 22690 1727204261.45387: Calling all_inventory to load vars for managed-node2 22690 1727204261.45390: Calling groups_inventory to load vars for managed-node2 22690 1727204261.45392: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.45405: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.45408: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.45410: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.46085: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000040 22690 1727204261.46090: WORKER PROCESS EXITING 22690 1727204261.47355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.49782: done with get_vars() 22690 1727204261.49818: done getting variables 22690 1727204261.49889: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.101) 0:00:28.782 ***** 22690 1727204261.49923: entering _queue_task() for managed-node2/dnf 22690 1727204261.50317: worker is 1 (out of 1 available) 22690 1727204261.50331: exiting _queue_task() 
for managed-node2/dnf 22690 1727204261.50346: done queuing things up, now waiting for results queue to drain 22690 1727204261.50348: waiting for pending results... 22690 1727204261.50787: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22690 1727204261.50817: in run() - task 127b8e07-fff9-78bb-bf56-000000000041 22690 1727204261.50843: variable 'ansible_search_path' from source: unknown 22690 1727204261.50853: variable 'ansible_search_path' from source: unknown 22690 1727204261.50913: calling self._execute() 22690 1727204261.51025: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.51040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.51057: variable 'omit' from source: magic vars 22690 1727204261.52072: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.52076: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.52441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204261.55911: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204261.56013: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204261.56067: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204261.56112: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204261.56151: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204261.56246: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.56291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.56327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.56385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.56408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.56550: variable 'ansible_distribution' from source: facts 22690 1727204261.56562: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.56585: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22690 1727204261.56746: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204261.57341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.57345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.57348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.57350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.57353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.57355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.57358: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.57360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.57495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.57516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.57564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.57662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.57702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.57747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.57767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.57954: variable 'network_connections' from source: play vars 22690 1727204261.57975: variable 'profile' from source: play vars 22690 1727204261.58064: variable 'profile' from source: play vars 22690 1727204261.58100: variable 'interface' from source: set_fact 22690 
1727204261.58201: variable 'interface' from source: set_fact 22690 1727204261.58290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204261.58755: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204261.58939: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204261.58943: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204261.58953: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204261.59020: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204261.59048: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204261.59096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.59131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204261.59191: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204261.59719: variable 'network_connections' from source: play vars 22690 1727204261.59763: variable 'profile' from source: play vars 22690 1727204261.60047: variable 'profile' from source: play vars 22690 1727204261.60157: variable 'interface' from source: set_fact 22690 1727204261.60160: variable 'interface' from source: set_fact 22690 1727204261.60165: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204261.60175: when evaluation is False, skipping this task 22690 1727204261.60182: _execute() done 22690 1727204261.60188: dumping result to json 22690 1727204261.60194: done dumping result, returning 22690 1727204261.60204: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000041] 22690 1727204261.60274: sending task result for task 127b8e07-fff9-78bb-bf56-000000000041 22690 1727204261.60572: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000041 22690 1727204261.60575: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204261.60625: no more pending results, returning what we have 22690 1727204261.60629: results queue empty 22690 1727204261.60630: checking for any_errors_fatal 22690 1727204261.60637: done checking for any_errors_fatal 22690 1727204261.60638: checking for max_fail_percentage 22690 1727204261.60639: done checking for max_fail_percentage 22690 1727204261.60640: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.60640: done checking to see if all hosts have failed 
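The DNF check just skipped (tasks/main.yml:36) is gated twice: first on the package manager being DNF-based (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7, which evaluated True), then on a wireless or team profile being present in network_connections, which evaluated False, so the check-mode update never runs. One plausible shape for such a task; the module arguments and the register name are assumptions, only the two conditions are taken from the log:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      register: __network_dnf_check        # hypothetical register name
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined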
22690 1727204261.60641: getting the remaining hosts for this loop 22690 1727204261.60643: done getting the remaining hosts for this loop 22690 1727204261.60647: getting the next task for host managed-node2 22690 1727204261.60653: done getting next task for host managed-node2 22690 1727204261.60657: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22690 1727204261.60659: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204261.60676: getting variables 22690 1727204261.60678: in VariableManager get_vars() 22690 1727204261.60720: Calling all_inventory to load vars for managed-node2 22690 1727204261.60724: Calling groups_inventory to load vars for managed-node2 22690 1727204261.60726: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.60738: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.60742: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.60745: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.64589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.69128: done with get_vars() 22690 1727204261.69163: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22690 1727204261.69252: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.195) 0:00:28.978 ***** 22690 1727204261.69494: entering _queue_task() for managed-node2/yum 22690 1727204261.70316: worker is 1 (out of 1 available) 22690 1727204261.70331: exiting _queue_task() for managed-node2/yum 22690 1727204261.70344: done queuing things up, now waiting for results queue to drain 22690 1727204261.70346: waiting for pending results... 
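Note the record above: ansible-playbook redirects ansible.builtin.yum to ansible.builtin.dnf on this control node, but the task itself (tasks/main.yml:48) is the EL7-and-older counterpart of the previous check, and the records that follow show its gate ansible_distribution_major_version | int < 8 evaluating False. A mirrored sketch under the same assumptions as the DNF version; the wireless/team condition is assumed here because the version gate short-circuits before it is logged:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8                                     # reported False below
        - __network_wireless_connections_defined or __network_team_connections_defined     # assumed, not logged for this task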
22690 1727204261.70749: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22690 1727204261.70972: in run() - task 127b8e07-fff9-78bb-bf56-000000000042 22690 1727204261.71011: variable 'ansible_search_path' from source: unknown 22690 1727204261.71015: variable 'ansible_search_path' from source: unknown 22690 1727204261.71150: calling self._execute() 22690 1727204261.71336: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.71341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.71555: variable 'omit' from source: magic vars 22690 1727204261.72315: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.72441: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.72759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204261.78073: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204261.78078: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204261.78117: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204261.78169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204261.78206: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204261.78431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.78435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.78438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.78441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.78444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.78556: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.78649: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22690 1727204261.78652: when evaluation is False, skipping this task 22690 1727204261.78655: _execute() done 22690 1727204261.78657: dumping result to json 22690 1727204261.78660: done dumping result, returning 22690 1727204261.78662: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000042] 22690 
1727204261.78665: sending task result for task 127b8e07-fff9-78bb-bf56-000000000042 22690 1727204261.78738: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000042 22690 1727204261.78741: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22690 1727204261.78807: no more pending results, returning what we have 22690 1727204261.78811: results queue empty 22690 1727204261.78812: checking for any_errors_fatal 22690 1727204261.78823: done checking for any_errors_fatal 22690 1727204261.78824: checking for max_fail_percentage 22690 1727204261.78825: done checking for max_fail_percentage 22690 1727204261.78826: checking to see if all hosts have failed and the running result is not ok 22690 1727204261.78827: done checking to see if all hosts have failed 22690 1727204261.78828: getting the remaining hosts for this loop 22690 1727204261.78829: done getting the remaining hosts for this loop 22690 1727204261.78834: getting the next task for host managed-node2 22690 1727204261.78840: done getting next task for host managed-node2 22690 1727204261.78844: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22690 1727204261.78846: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204261.78860: getting variables 22690 1727204261.78862: in VariableManager get_vars() 22690 1727204261.78901: Calling all_inventory to load vars for managed-node2 22690 1727204261.78904: Calling groups_inventory to load vars for managed-node2 22690 1727204261.78907: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204261.78917: Calling all_plugins_play to load vars for managed-node2 22690 1727204261.78920: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204261.78923: Calling groups_plugins_play to load vars for managed-node2 22690 1727204261.82540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204261.86932: done with get_vars() 22690 1727204261.86970: done getting variables 22690 1727204261.87041: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.175) 0:00:29.154 ***** 22690 1727204261.87079: entering _queue_task() for managed-node2/fail 22690 1727204261.87452: worker is 1 (out of 1 available) 22690 1727204261.87570: exiting _queue_task() for managed-node2/fail 22690 1727204261.87585: done queuing things up, now waiting for results queue to drain 22690 1727204261.87586: waiting for pending results... 
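Restarting NetworkManager to pick up wireless or team support can interrupt connectivity, so the role queues a fail task (tasks/main.yml:60) that demands explicit consent; the records that follow show it skipped because no wireless or team profile is defined in this play. A hypothetical sketch: the consent flag network_allow_restart is an invented name for illustration, and only the wireless/team condition comes from the log:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        # The flag name below is hypothetical; only the first when condition appears in the log.
        msg: Installing wireless or team support requires restarting NetworkManager. Set network_allow_restart to true to confirm.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not (network_allow_restart | default(false))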
22690 1727204261.87886: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22690 1727204261.87939: in run() - task 127b8e07-fff9-78bb-bf56-000000000043 22690 1727204261.87962: variable 'ansible_search_path' from source: unknown 22690 1727204261.87974: variable 'ansible_search_path' from source: unknown 22690 1727204261.88072: calling self._execute() 22690 1727204261.88133: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204261.88147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204261.88163: variable 'omit' from source: magic vars 22690 1727204261.88875: variable 'ansible_distribution_major_version' from source: facts 22690 1727204261.88879: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204261.89150: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204261.89679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204261.94986: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204261.95875: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204261.96138: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204261.96176: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204261.96205: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204261.96511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.96547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.96577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.96637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.96641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.96901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.96935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.96959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.97003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.97043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.97072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204261.97304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204261.97372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.97377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204261.97391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204261.97805: variable 'network_connections' from source: play vars 22690 1727204261.97815: variable 'profile' from source: play vars 22690 1727204261.98110: variable 'profile' from source: play vars 22690 1727204261.98113: variable 'interface' from source: set_fact 22690 1727204261.98269: variable 'interface' from source: set_fact 22690 1727204261.98493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204261.98896: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204261.98941: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204261.99013: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204261.99036: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204261.99145: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204261.99171: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204261.99251: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204261.99340: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204261.99389: 
variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204262.00074: variable 'network_connections' from source: play vars 22690 1727204262.00082: variable 'profile' from source: play vars 22690 1727204262.00157: variable 'profile' from source: play vars 22690 1727204262.00161: variable 'interface' from source: set_fact 22690 1727204262.00232: variable 'interface' from source: set_fact 22690 1727204262.00259: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204262.00262: when evaluation is False, skipping this task 22690 1727204262.00267: _execute() done 22690 1727204262.00475: dumping result to json 22690 1727204262.00478: done dumping result, returning 22690 1727204262.00545: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000043] 22690 1727204262.00558: sending task result for task 127b8e07-fff9-78bb-bf56-000000000043 22690 1727204262.00633: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000043 22690 1727204262.00770: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204262.00830: no more pending results, returning what we have 22690 1727204262.00835: results queue empty 22690 1727204262.00836: checking for any_errors_fatal 22690 1727204262.00842: done checking for any_errors_fatal 22690 1727204262.00843: checking for max_fail_percentage 22690 1727204262.00845: done checking for max_fail_percentage 22690 1727204262.00846: checking to see if all hosts have failed and the running result is not ok 22690 1727204262.00846: done checking to see if all hosts have failed 22690 1727204262.00847: getting the remaining hosts for this loop 22690 1727204262.00849: done getting the remaining hosts for this loop 22690 1727204262.00854: getting the next task for host managed-node2 22690 1727204262.00862: done getting next task for host managed-node2 22690 1727204262.00872: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22690 1727204262.00874: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204262.00890: getting variables 22690 1727204262.00892: in VariableManager get_vars() 22690 1727204262.00933: Calling all_inventory to load vars for managed-node2 22690 1727204262.00935: Calling groups_inventory to load vars for managed-node2 22690 1727204262.00937: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204262.00949: Calling all_plugins_play to load vars for managed-node2 22690 1727204262.00952: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204262.00955: Calling groups_plugins_play to load vars for managed-node2 22690 1727204262.03652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204262.07889: done with get_vars() 22690 1727204262.07929: done getting variables 22690 1727204262.08213: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.211) 0:00:29.365 ***** 22690 1727204262.08248: entering _queue_task() for managed-node2/package 22690 1727204262.09157: worker is 1 (out of 1 available) 22690 1727204262.09375: exiting _queue_task() for managed-node2/package 22690 1727204262.09389: done queuing things up, now waiting for results queue to drain 22690 1727204262.09390: waiting for pending results... 22690 1727204262.09840: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 22690 1727204262.10227: in run() - task 127b8e07-fff9-78bb-bf56-000000000044 22690 1727204262.10233: variable 'ansible_search_path' from source: unknown 22690 1727204262.10237: variable 'ansible_search_path' from source: unknown 22690 1727204262.10339: calling self._execute() 22690 1727204262.10802: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204262.10806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204262.10809: variable 'omit' from source: magic vars 22690 1727204262.11689: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.11694: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204262.12271: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204262.12732: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204262.13032: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204262.13078: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204262.13399: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204262.13703: variable 'network_packages' from source: role '' defaults 22690 1727204262.14001: variable '__network_provider_setup' from source: role '' defaults 22690 1727204262.14093: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204262.14372: variable 
'__network_service_name_default_nm' from source: role '' defaults 22690 1727204262.14375: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204262.14378: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204262.15074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204262.19757: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204262.19961: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204262.20126: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204262.20169: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204262.20422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204262.20468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.20570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.20673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.20723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.20965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.20972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.20975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.21170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.21174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.21178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.21947: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22690 1727204262.22049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.22286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.22290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.22292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.22398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.22598: variable 'ansible_python' from source: facts 22690 1727204262.22746: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22690 1727204262.23036: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204262.23253: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204262.23552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.23663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.23707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.23755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.23834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.24000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.24092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.24130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.24283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.24285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.24664: variable 'network_connections' from source: play vars 22690 1727204262.24903: variable 'profile' from source: play vars 22690 1727204262.25030: variable 'profile' from source: play vars 22690 1727204262.25125: variable 'interface' from source: set_fact 22690 1727204262.25875: variable 'interface' from source: set_fact 22690 1727204262.25881: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204262.25883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204262.25886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.25889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204262.26178: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204262.26876: variable 'network_connections' from source: play vars 22690 1727204262.26886: variable 'profile' from source: play vars 22690 1727204262.27172: variable 'profile' from source: play vars 22690 1727204262.27176: variable 'interface' from source: set_fact 22690 1727204262.27233: variable 'interface' from source: set_fact 22690 1727204262.27374: variable '__network_packages_default_wireless' from source: role '' defaults 22690 1727204262.27581: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204262.28312: variable 'network_connections' from source: play vars 22690 1727204262.28389: variable 'profile' from source: play vars 22690 1727204262.28671: variable 'profile' from source: play vars 22690 1727204262.28674: variable 'interface' from source: set_fact 22690 1727204262.28820: variable 'interface' from source: set_fact 22690 1727204262.28978: variable '__network_packages_default_team' from source: role '' defaults 22690 1727204262.29264: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204262.30129: variable 'network_connections' from source: play vars 22690 1727204262.30135: variable 'profile' from source: play vars 22690 1727204262.30181: variable 'profile' from source: play vars 22690 1727204262.30250: variable 'interface' from source: set_fact 22690 1727204262.30450: variable 'interface' from source: set_fact 22690 1727204262.30673: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204262.30847: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204262.30903: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204262.30992: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204262.31606: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22690 1727204262.32988: variable 'network_connections' from source: play vars 22690 1727204262.33001: variable 'profile' from source: play vars 22690 
1727204262.33245: variable 'profile' from source: play vars 22690 1727204262.33249: variable 'interface' from source: set_fact 22690 1727204262.33472: variable 'interface' from source: set_fact 22690 1727204262.33476: variable 'ansible_distribution' from source: facts 22690 1727204262.33479: variable '__network_rh_distros' from source: role '' defaults 22690 1727204262.33481: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.33487: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22690 1727204262.33931: variable 'ansible_distribution' from source: facts 22690 1727204262.33935: variable '__network_rh_distros' from source: role '' defaults 22690 1727204262.33938: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.33947: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22690 1727204262.34367: variable 'ansible_distribution' from source: facts 22690 1727204262.34371: variable '__network_rh_distros' from source: role '' defaults 22690 1727204262.34502: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.34771: variable 'network_provider' from source: set_fact 22690 1727204262.34774: variable 'ansible_facts' from source: unknown 22690 1727204262.36671: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22690 1727204262.36684: when evaluation is False, skipping this task 22690 1727204262.36687: _execute() done 22690 1727204262.36690: dumping result to json 22690 1727204262.36695: done dumping result, returning 22690 1727204262.36708: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [127b8e07-fff9-78bb-bf56-000000000044] 22690 1727204262.36713: sending task result for task 127b8e07-fff9-78bb-bf56-000000000044 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22690 1727204262.36887: no more pending results, returning what we have 22690 1727204262.36891: results queue empty 22690 1727204262.36892: checking for any_errors_fatal 22690 1727204262.36901: done checking for any_errors_fatal 22690 1727204262.36901: checking for max_fail_percentage 22690 1727204262.36903: done checking for max_fail_percentage 22690 1727204262.36904: checking to see if all hosts have failed and the running result is not ok 22690 1727204262.36905: done checking to see if all hosts have failed 22690 1727204262.36905: getting the remaining hosts for this loop 22690 1727204262.36907: done getting the remaining hosts for this loop 22690 1727204262.36911: getting the next task for host managed-node2 22690 1727204262.36918: done getting next task for host managed-node2 22690 1727204262.36922: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22690 1727204262.36924: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204262.36939: getting variables 22690 1727204262.36941: in VariableManager get_vars() 22690 1727204262.36982: Calling all_inventory to load vars for managed-node2 22690 1727204262.36984: Calling groups_inventory to load vars for managed-node2 22690 1727204262.36987: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204262.36999: Calling all_plugins_play to load vars for managed-node2 22690 1727204262.37008: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204262.37012: Calling groups_plugins_play to load vars for managed-node2 22690 1727204262.37899: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000044 22690 1727204262.37902: WORKER PROCESS EXITING 22690 1727204262.40431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204262.45228: done with get_vars() 22690 1727204262.45259: done getting variables 22690 1727204262.45327: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.371) 0:00:29.736 ***** 22690 1727204262.45361: entering _queue_task() for managed-node2/package 22690 1727204262.46155: worker is 1 (out of 1 available) 22690 1727204262.46373: exiting _queue_task() for managed-node2/package 22690 1727204262.46388: done queuing things up, now waiting for results queue to drain 22690 1727204262.46389: waiting for pending results... 
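The "Install packages" task above was skipped because every entry in network_packages was already present in the gathered package facts, which is exactly what the false_condition in the skip output says. As a rough sketch of that guard pattern (not the role's actual task text; only the network_packages variable and the when-expression are taken from the log, the surrounding task bodies are illustrative):

- name: Gather the package facts the conditional relies on
  ansible.builtin.package_facts:
    manager: auto

- name: Install packages  # skipped when every listed package is already installed
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())

The subset test keeps the package module from running at all on hosts where nothing is missing, which is why the log records a skip rather than an "ok" result here.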
22690 1727204262.46903: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22690 1727204262.47068: in run() - task 127b8e07-fff9-78bb-bf56-000000000045 22690 1727204262.47154: variable 'ansible_search_path' from source: unknown 22690 1727204262.47164: variable 'ansible_search_path' from source: unknown 22690 1727204262.47212: calling self._execute() 22690 1727204262.47479: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204262.47483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204262.47487: variable 'omit' from source: magic vars 22690 1727204262.48477: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.48501: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204262.48781: variable 'network_state' from source: role '' defaults 22690 1727204262.48972: Evaluated conditional (network_state != {}): False 22690 1727204262.48975: when evaluation is False, skipping this task 22690 1727204262.48978: _execute() done 22690 1727204262.48980: dumping result to json 22690 1727204262.48983: done dumping result, returning 22690 1727204262.48985: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-78bb-bf56-000000000045] 22690 1727204262.48988: sending task result for task 127b8e07-fff9-78bb-bf56-000000000045 22690 1727204262.49205: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000045 22690 1727204262.49208: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204262.49261: no more pending results, returning what we have 22690 1727204262.49267: results queue empty 22690 1727204262.49268: checking for any_errors_fatal 22690 1727204262.49276: done checking for any_errors_fatal 22690 1727204262.49277: checking for max_fail_percentage 22690 1727204262.49280: done checking for max_fail_percentage 22690 1727204262.49281: checking to see if all hosts have failed and the running result is not ok 22690 1727204262.49282: done checking to see if all hosts have failed 22690 1727204262.49283: getting the remaining hosts for this loop 22690 1727204262.49285: done getting the remaining hosts for this loop 22690 1727204262.49290: getting the next task for host managed-node2 22690 1727204262.49297: done getting next task for host managed-node2 22690 1727204262.49302: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22690 1727204262.49304: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204262.49322: getting variables 22690 1727204262.49324: in VariableManager get_vars() 22690 1727204262.49581: Calling all_inventory to load vars for managed-node2 22690 1727204262.49585: Calling groups_inventory to load vars for managed-node2 22690 1727204262.49588: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204262.49600: Calling all_plugins_play to load vars for managed-node2 22690 1727204262.49604: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204262.49606: Calling groups_plugins_play to load vars for managed-node2 22690 1727204262.53607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204262.58202: done with get_vars() 22690 1727204262.58245: done getting variables 22690 1727204262.58316: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.129) 0:00:29.866 ***** 22690 1727204262.58353: entering _queue_task() for managed-node2/package 22690 1727204262.59850: worker is 1 (out of 1 available) 22690 1727204262.59862: exiting _queue_task() for managed-node2/package 22690 1727204262.59877: done queuing things up, now waiting for results queue to drain 22690 1727204262.59878: waiting for pending results... 
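The task skipped just above, "Install NetworkManager and nmstate when using network_state variable", is gated on network_state, which is still at its role default of {} in this run. A minimal sketch of that gating, with the package list inferred only from the task name (the real task at roles/network/tasks/main.yml:85 may differ):

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate  # package names inferred from the task name, not confirmed from the role
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}

Supplying any non-empty mapping for network_state in the play vars would flip this conditional to True, and the same change would also un-skip the python3-libnmstate task that follows.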
22690 1727204262.60342: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22690 1727204262.60674: in run() - task 127b8e07-fff9-78bb-bf56-000000000046 22690 1727204262.60678: variable 'ansible_search_path' from source: unknown 22690 1727204262.60681: variable 'ansible_search_path' from source: unknown 22690 1727204262.60684: calling self._execute() 22690 1727204262.60932: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204262.60946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204262.60962: variable 'omit' from source: magic vars 22690 1727204262.61832: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.61850: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204262.62121: variable 'network_state' from source: role '' defaults 22690 1727204262.62201: Evaluated conditional (network_state != {}): False 22690 1727204262.62402: when evaluation is False, skipping this task 22690 1727204262.62406: _execute() done 22690 1727204262.62408: dumping result to json 22690 1727204262.62410: done dumping result, returning 22690 1727204262.62413: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-78bb-bf56-000000000046] 22690 1727204262.62415: sending task result for task 127b8e07-fff9-78bb-bf56-000000000046 22690 1727204262.62671: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000046 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204262.62728: no more pending results, returning what we have 22690 1727204262.62733: results queue empty 22690 1727204262.62734: checking for any_errors_fatal 22690 1727204262.62745: done checking for any_errors_fatal 22690 1727204262.62746: checking for max_fail_percentage 22690 1727204262.62748: done checking for max_fail_percentage 22690 1727204262.62749: checking to see if all hosts have failed and the running result is not ok 22690 1727204262.62750: done checking to see if all hosts have failed 22690 1727204262.62751: getting the remaining hosts for this loop 22690 1727204262.62753: done getting the remaining hosts for this loop 22690 1727204262.62758: getting the next task for host managed-node2 22690 1727204262.62767: done getting next task for host managed-node2 22690 1727204262.62772: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22690 1727204262.62774: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204262.62791: getting variables 22690 1727204262.62793: in VariableManager get_vars() 22690 1727204262.62841: Calling all_inventory to load vars for managed-node2 22690 1727204262.62844: Calling groups_inventory to load vars for managed-node2 22690 1727204262.62847: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204262.62862: Calling all_plugins_play to load vars for managed-node2 22690 1727204262.63170: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204262.63177: Calling groups_plugins_play to load vars for managed-node2 22690 1727204262.64005: WORKER PROCESS EXITING 22690 1727204262.67733: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204262.72976: done with get_vars() 22690 1727204262.73014: done getting variables 22690 1727204262.73084: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.147) 0:00:30.014 ***** 22690 1727204262.73119: entering _queue_task() for managed-node2/service 22690 1727204262.73916: worker is 1 (out of 1 available) 22690 1727204262.73929: exiting _queue_task() for managed-node2/service 22690 1727204262.73942: done queuing things up, now waiting for results queue to drain 22690 1727204262.73943: waiting for pending results... 
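The skip output printed for these tasks ("changed", "false_condition", "skip_reason") can also be inspected from a playbook by registering the result, which is a generic way to surface the same information without -vvvv. A hypothetical example (the variable name libnmstate_install is made up; only the module, package name, and condition come from the log, and false_condition may not be present on older ansible-core releases, hence the default filter):

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
  register: libnmstate_install  # hypothetical name for the registered result

- name: Report why the install was skipped
  ansible.builtin.debug:
    msg: "{{ libnmstate_install.false_condition | default('task was not skipped') }}"
  when: libnmstate_install is skipped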
22690 1727204262.74702: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22690 1727204262.74917: in run() - task 127b8e07-fff9-78bb-bf56-000000000047 22690 1727204262.75003: variable 'ansible_search_path' from source: unknown 22690 1727204262.75013: variable 'ansible_search_path' from source: unknown 22690 1727204262.75187: calling self._execute() 22690 1727204262.75368: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204262.75533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204262.75554: variable 'omit' from source: magic vars 22690 1727204262.76842: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.76921: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204262.77574: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204262.78200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204262.81833: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204262.81838: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204262.81849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204262.81896: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204262.82072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204262.82077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.82080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.82413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.82421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.82424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.82527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.82560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.82604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22690 1727204262.82660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.82683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.82742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.82775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.82807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.82861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.82883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.83093: variable 'network_connections' from source: play vars 22690 1727204262.83113: variable 'profile' from source: play vars 22690 1727204262.83207: variable 'profile' from source: play vars 22690 1727204262.83220: variable 'interface' from source: set_fact 22690 1727204262.83339: variable 'interface' from source: set_fact 22690 1727204262.83678: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204262.83983: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204262.84038: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204262.84195: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204262.84235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204262.84472: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204262.84477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204262.84702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.84706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204262.84708: variable '__network_team_connections_defined' from source: role '' defaults 22690 
1727204262.85040: variable 'network_connections' from source: play vars 22690 1727204262.85052: variable 'profile' from source: play vars 22690 1727204262.85130: variable 'profile' from source: play vars 22690 1727204262.85145: variable 'interface' from source: set_fact 22690 1727204262.85214: variable 'interface' from source: set_fact 22690 1727204262.85255: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204262.85263: when evaluation is False, skipping this task 22690 1727204262.85274: _execute() done 22690 1727204262.85281: dumping result to json 22690 1727204262.85288: done dumping result, returning 22690 1727204262.85300: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000047] 22690 1727204262.85321: sending task result for task 127b8e07-fff9-78bb-bf56-000000000047 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204262.85510: no more pending results, returning what we have 22690 1727204262.85517: results queue empty 22690 1727204262.85518: checking for any_errors_fatal 22690 1727204262.85525: done checking for any_errors_fatal 22690 1727204262.85526: checking for max_fail_percentage 22690 1727204262.85528: done checking for max_fail_percentage 22690 1727204262.85529: checking to see if all hosts have failed and the running result is not ok 22690 1727204262.85530: done checking to see if all hosts have failed 22690 1727204262.85531: getting the remaining hosts for this loop 22690 1727204262.85532: done getting the remaining hosts for this loop 22690 1727204262.85537: getting the next task for host managed-node2 22690 1727204262.85543: done getting next task for host managed-node2 22690 1727204262.85548: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22690 1727204262.85550: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204262.85567: getting variables 22690 1727204262.85569: in VariableManager get_vars() 22690 1727204262.85610: Calling all_inventory to load vars for managed-node2 22690 1727204262.85613: Calling groups_inventory to load vars for managed-node2 22690 1727204262.85618: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204262.85631: Calling all_plugins_play to load vars for managed-node2 22690 1727204262.85634: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204262.85638: Calling groups_plugins_play to load vars for managed-node2 22690 1727204262.86354: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000047 22690 1727204262.86358: WORKER PROCESS EXITING 22690 1727204262.87713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204262.90088: done with get_vars() 22690 1727204262.90118: done getting variables 22690 1727204262.90187: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:42 -0400 (0:00:00.170) 0:00:30.185 ***** 22690 1727204262.90223: entering _queue_task() for managed-node2/service 22690 1727204262.90612: worker is 1 (out of 1 available) 22690 1727204262.90629: exiting _queue_task() for managed-node2/service 22690 1727204262.90645: done queuing things up, now waiting for results queue to drain 22690 1727204262.90646: waiting for pending results... 
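Two role-default booleans decide the restart that was just skipped: __network_wireless_connections_defined and __network_team_connections_defined, both False for this profile. A hedged sketch of that guard (the two variable names and the when-expression are from the log; the service name and task body are illustrative, not the role's actual implementation):

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager  # illustrative; the role resolves the real service name from its defaults
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined

The next task queued above, "Enable and start NetworkManager", uses the same service action plugin but is gated on network_provider == "nm" or network_state != {}, which evaluates True in the log below, so that one proceeds through to the SSH module transfer and execution.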
22690 1727204262.91290: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22690 1727204262.91295: in run() - task 127b8e07-fff9-78bb-bf56-000000000048 22690 1727204262.91299: variable 'ansible_search_path' from source: unknown 22690 1727204262.91302: variable 'ansible_search_path' from source: unknown 22690 1727204262.91375: calling self._execute() 22690 1727204262.91489: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204262.91617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204262.91771: variable 'omit' from source: magic vars 22690 1727204262.92775: variable 'ansible_distribution_major_version' from source: facts 22690 1727204262.92779: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204262.92973: variable 'network_provider' from source: set_fact 22690 1727204262.93080: variable 'network_state' from source: role '' defaults 22690 1727204262.93104: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22690 1727204262.93120: variable 'omit' from source: magic vars 22690 1727204262.93214: variable 'omit' from source: magic vars 22690 1727204262.93355: variable 'network_service_name' from source: role '' defaults 22690 1727204262.93498: variable 'network_service_name' from source: role '' defaults 22690 1727204262.93743: variable '__network_provider_setup' from source: role '' defaults 22690 1727204262.93970: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204262.93976: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204262.93979: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204262.94136: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204262.94691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204262.97430: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204262.97531: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204262.97585: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204262.97631: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204262.97660: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204262.97758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.97804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.97839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.97895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 22690 1727204262.97918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.97977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.98011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.98043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.98092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.98118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.98646: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22690 1727204262.98793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204262.98829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204262.99083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204262.99087: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204262.99090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204262.99269: variable 'ansible_python' from source: facts 22690 1727204262.99330: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22690 1727204262.99631: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204262.99757: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204263.00239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204263.00269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204263.00295: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.00451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204263.00467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204263.00524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204263.00549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204263.00575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.00759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204263.00763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204263.01085: variable 'network_connections' from source: play vars 22690 1727204263.01089: variable 'profile' from source: play vars 22690 1727204263.01091: variable 'profile' from source: play vars 22690 1727204263.01277: variable 'interface' from source: set_fact 22690 1727204263.01345: variable 'interface' from source: set_fact 22690 1727204263.01662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204263.02093: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204263.02146: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204263.02195: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204263.02238: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204263.02521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204263.02556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204263.02609: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.02629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 22690 1727204263.02886: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204263.03431: variable 'network_connections' from source: play vars 22690 1727204263.03438: variable 'profile' from source: play vars 22690 1727204263.03731: variable 'profile' from source: play vars 22690 1727204263.03737: variable 'interface' from source: set_fact 22690 1727204263.03809: variable 'interface' from source: set_fact 22690 1727204263.03847: variable '__network_packages_default_wireless' from source: role '' defaults 22690 1727204263.04136: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204263.04864: variable 'network_connections' from source: play vars 22690 1727204263.04870: variable 'profile' from source: play vars 22690 1727204263.04945: variable 'profile' from source: play vars 22690 1727204263.04949: variable 'interface' from source: set_fact 22690 1727204263.05035: variable 'interface' from source: set_fact 22690 1727204263.05067: variable '__network_packages_default_team' from source: role '' defaults 22690 1727204263.05352: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204263.05863: variable 'network_connections' from source: play vars 22690 1727204263.05874: variable 'profile' from source: play vars 22690 1727204263.06151: variable 'profile' from source: play vars 22690 1727204263.06154: variable 'interface' from source: set_fact 22690 1727204263.06234: variable 'interface' from source: set_fact 22690 1727204263.06507: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204263.06574: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204263.06581: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204263.06647: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204263.07288: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22690 1727204263.08273: variable 'network_connections' from source: play vars 22690 1727204263.08488: variable 'profile' from source: play vars 22690 1727204263.08560: variable 'profile' from source: play vars 22690 1727204263.08564: variable 'interface' from source: set_fact 22690 1727204263.08648: variable 'interface' from source: set_fact 22690 1727204263.08658: variable 'ansible_distribution' from source: facts 22690 1727204263.08661: variable '__network_rh_distros' from source: role '' defaults 22690 1727204263.08670: variable 'ansible_distribution_major_version' from source: facts 22690 1727204263.08892: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22690 1727204263.09296: variable 'ansible_distribution' from source: facts 22690 1727204263.09300: variable '__network_rh_distros' from source: role '' defaults 22690 1727204263.09390: variable 'ansible_distribution_major_version' from source: facts 22690 1727204263.09393: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22690 1727204263.09723: variable 'ansible_distribution' from source: facts 22690 1727204263.09727: variable '__network_rh_distros' from source: role '' defaults 22690 1727204263.09733: variable 'ansible_distribution_major_version' from source: facts 22690 1727204263.09818: variable 'network_provider' from source: set_fact 22690 1727204263.09822: variable 
'omit' from source: magic vars 22690 1727204263.09911: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204263.10085: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204263.10107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204263.10130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204263.10139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204263.10177: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204263.10180: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204263.10182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204263.10573: Set connection var ansible_connection to ssh 22690 1727204263.10577: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204263.10579: Set connection var ansible_pipelining to False 22690 1727204263.10582: Set connection var ansible_shell_type to sh 22690 1727204263.10584: Set connection var ansible_shell_executable to /bin/sh 22690 1727204263.10586: Set connection var ansible_timeout to 10 22690 1727204263.10588: variable 'ansible_shell_executable' from source: unknown 22690 1727204263.10590: variable 'ansible_connection' from source: unknown 22690 1727204263.10593: variable 'ansible_module_compression' from source: unknown 22690 1727204263.10595: variable 'ansible_shell_type' from source: unknown 22690 1727204263.10597: variable 'ansible_shell_executable' from source: unknown 22690 1727204263.10599: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204263.10606: variable 'ansible_pipelining' from source: unknown 22690 1727204263.10608: variable 'ansible_timeout' from source: unknown 22690 1727204263.10611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204263.10945: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204263.10949: variable 'omit' from source: magic vars 22690 1727204263.10951: starting attempt loop 22690 1727204263.10954: running the handler 22690 1727204263.11176: variable 'ansible_facts' from source: unknown 22690 1727204263.13525: _low_level_execute_command(): starting 22690 1727204263.13529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204263.15498: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204263.15603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204263.15733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204263.17537: stdout chunk (state=3): >>>/root <<< 22690 1727204263.18475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204263.18480: stdout chunk (state=3): >>><<< 22690 1727204263.18483: stderr chunk (state=3): >>><<< 22690 1727204263.18485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204263.18488: _low_level_execute_command(): starting 22690 1727204263.18491: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592 `" && echo ansible-tmp-1727204263.1815505-24712-177373864547592="` echo /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592 `" ) && sleep 0' 22690 1727204263.19989: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 
1727204263.20101: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204263.20126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204263.20148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204263.20305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204263.22224: stdout chunk (state=3): >>>ansible-tmp-1727204263.1815505-24712-177373864547592=/root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592 <<< 22690 1727204263.22503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204263.22528: stderr chunk (state=3): >>><<< 22690 1727204263.22545: stdout chunk (state=3): >>><<< 22690 1727204263.22571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204263.1815505-24712-177373864547592=/root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204263.22714: variable 'ansible_module_compression' from source: unknown 22690 1727204263.22720: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 22690 1727204263.22773: variable 'ansible_facts' from source: unknown 22690 1727204263.22997: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py 22690 1727204263.23188: Sending initial data 22690 1727204263.23285: Sent initial data (156 bytes) 22690 1727204263.24281: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204263.24384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204263.24511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204263.24532: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204263.24554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204263.24701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204263.26300: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22690 1727204263.26344: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204263.26475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204263.26589: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpvcflhnxr /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py <<< 22690 1727204263.26604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py" <<< 22690 1727204263.26662: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpvcflhnxr" to remote "/root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py" <<< 22690 1727204263.29453: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204263.29458: stdout chunk (state=3): >>><<< 22690 1727204263.29461: stderr chunk (state=3): >>><<< 22690 1727204263.29464: done transferring module to remote 22690 1727204263.29559: _low_level_execute_command(): starting 22690 1727204263.29563: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/ /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py && sleep 0' 22690 1727204263.30789: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204263.30849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204263.30860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204263.31005: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204263.31186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204263.33123: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204263.33225: stderr chunk (state=3): >>><<< 22690 1727204263.33229: stdout chunk (state=3): >>><<< 22690 1727204263.33287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204263.33291: _low_level_execute_command(): starting 22690 1727204263.33327: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/AnsiballZ_systemd.py && sleep 0' 22690 1727204263.34873: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204263.35190: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204263.35235: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204263.35334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204263.67188: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3511717888", "CPUUsageNSec": "907535000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": 
"yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 22690 1727204263.67202: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": <<< 22690 1727204263.67327: stdout chunk (state=3): >>>"system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": 
"started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22690 1727204263.69010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204263.69448: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 22690 1727204263.69453: stdout chunk (state=3): >>><<< 22690 1727204263.69455: stderr chunk (state=3): >>><<< 22690 1727204263.69461: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4509696", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3511717888", "CPUUsageNSec": "907535000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", 
"BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", 
"enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204263.69735: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204263.69757: _low_level_execute_command(): starting 22690 1727204263.69760: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204263.1815505-24712-177373864547592/ > /dev/null 2>&1 && sleep 0' 22690 1727204263.71318: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204263.71445: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204263.71576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204263.73691: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204263.73695: stdout chunk (state=3): >>><<< 22690 1727204263.73698: stderr chunk (state=3): >>><<< 22690 1727204263.73701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204263.73703: handler run complete 22690 1727204263.73705: attempt loop complete, returning result 22690 1727204263.73707: _execute() done 22690 1727204263.73709: dumping result to json 22690 1727204263.73711: done dumping result, returning 22690 1727204263.73713: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-78bb-bf56-000000000048] 22690 1727204263.73718: sending task result for task 127b8e07-fff9-78bb-bf56-000000000048 22690 1727204263.74339: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000048 22690 1727204263.74343: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204263.74403: no more pending results, returning what we have 22690 1727204263.74407: results queue empty 22690 1727204263.74408: checking for any_errors_fatal 22690 1727204263.74413: done checking for any_errors_fatal 22690 1727204263.74414: checking for max_fail_percentage 22690 1727204263.74417: done checking for max_fail_percentage 22690 1727204263.74418: checking to see if all hosts have failed and the running result is not ok 22690 1727204263.74419: done checking to see if all hosts have failed 22690 1727204263.74420: getting the remaining hosts for this loop 22690 1727204263.74421: done getting the remaining hosts for this loop 22690 1727204263.74424: getting the next task for host managed-node2 22690 1727204263.74429: done getting next task for host managed-node2 22690 1727204263.74433: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22690 1727204263.74435: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204263.74445: getting variables 22690 1727204263.74446: in VariableManager get_vars() 22690 1727204263.74480: Calling all_inventory to load vars for managed-node2 22690 1727204263.74483: Calling groups_inventory to load vars for managed-node2 22690 1727204263.74485: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204263.74495: Calling all_plugins_play to load vars for managed-node2 22690 1727204263.74498: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204263.74500: Calling groups_plugins_play to load vars for managed-node2 22690 1727204263.77207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204263.79437: done with get_vars() 22690 1727204263.79482: done getting variables 22690 1727204263.79552: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.893) 0:00:31.079 ***** 22690 1727204263.79587: entering _queue_task() for managed-node2/service 22690 1727204263.80018: worker is 1 (out of 1 available) 22690 1727204263.80032: exiting _queue_task() for managed-node2/service 22690 1727204263.80047: done queuing things up, now waiting for results queue to drain 22690 1727204263.80048: waiting for pending results... 
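
The censored "Enable and start NetworkManager" result above was produced by an ansible.legacy.systemd call whose module_args appear verbatim in the log (name=NetworkManager, state=started, enabled=true, scope=system, with no_log in effect). A minimal standalone playbook that would issue the same call is sketched below; only the module arguments are taken from the log, while the play header (hosts, gather_facts) is illustrative and not copied from the role.

- name: Reproduce the logged NetworkManager service call (sketch)
  hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:      # logged as ansible.legacy.systemd; shown here with the builtin FQCN
        name: NetworkManager
        state: started
        enabled: true
        daemon_reload: false
        daemon_reexec: false
        scope: system
        no_block: false
      no_log: true                  # matches the "censored ... 'no_log: true'" result shown above

Because NetworkManager.service was already enabled and running on the target (ActiveState=active, UnitFileState=enabled in the returned status), the module reports "changed": false, which is exactly the ok result the log prints before moving on to the wpa_supplicant task.
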
22690 1727204263.80889: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22690 1727204263.80897: in run() - task 127b8e07-fff9-78bb-bf56-000000000049 22690 1727204263.80901: variable 'ansible_search_path' from source: unknown 22690 1727204263.80905: variable 'ansible_search_path' from source: unknown 22690 1727204263.81475: calling self._execute() 22690 1727204263.81479: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204263.81483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204263.81486: variable 'omit' from source: magic vars 22690 1727204263.82053: variable 'ansible_distribution_major_version' from source: facts 22690 1727204263.82080: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204263.82235: variable 'network_provider' from source: set_fact 22690 1727204263.82248: Evaluated conditional (network_provider == "nm"): True 22690 1727204263.82367: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204263.82474: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204263.82670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204263.86066: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204263.86074: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204263.86194: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204263.86237: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204263.86268: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204263.86480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204263.86521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204263.86601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.86826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204263.86830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204263.87072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204263.87076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22690 1727204263.87078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.87081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204263.87083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204263.87175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204263.87210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204263.87321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.87369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204263.87418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204263.87785: variable 'network_connections' from source: play vars 22690 1727204263.87848: variable 'profile' from source: play vars 22690 1727204263.87993: variable 'profile' from source: play vars 22690 1727204263.88006: variable 'interface' from source: set_fact 22690 1727204263.88086: variable 'interface' from source: set_fact 22690 1727204263.88248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204263.88521: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204263.88581: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204263.88619: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204263.88650: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204263.88708: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204263.88807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204263.88811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204263.88814: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204263.88861: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204263.89545: variable 'network_connections' from source: play vars 22690 1727204263.89557: variable 'profile' from source: play vars 22690 1727204263.89778: variable 'profile' from source: play vars 22690 1727204263.89781: variable 'interface' from source: set_fact 22690 1727204263.89970: variable 'interface' from source: set_fact 22690 1727204263.89973: Evaluated conditional (__network_wpa_supplicant_required): False 22690 1727204263.89976: when evaluation is False, skipping this task 22690 1727204263.89978: _execute() done 22690 1727204263.89987: dumping result to json 22690 1727204263.89989: done dumping result, returning 22690 1727204263.89991: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-78bb-bf56-000000000049] 22690 1727204263.89993: sending task result for task 127b8e07-fff9-78bb-bf56-000000000049 22690 1727204263.90530: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000049 22690 1727204263.90535: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22690 1727204263.90586: no more pending results, returning what we have 22690 1727204263.90589: results queue empty 22690 1727204263.90590: checking for any_errors_fatal 22690 1727204263.90610: done checking for any_errors_fatal 22690 1727204263.90611: checking for max_fail_percentage 22690 1727204263.90613: done checking for max_fail_percentage 22690 1727204263.90614: checking to see if all hosts have failed and the running result is not ok 22690 1727204263.90615: done checking to see if all hosts have failed 22690 1727204263.90616: getting the remaining hosts for this loop 22690 1727204263.90617: done getting the remaining hosts for this loop 22690 1727204263.90622: getting the next task for host managed-node2 22690 1727204263.90629: done getting next task for host managed-node2 22690 1727204263.90633: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22690 1727204263.90635: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204263.90651: getting variables 22690 1727204263.90653: in VariableManager get_vars() 22690 1727204263.90700: Calling all_inventory to load vars for managed-node2 22690 1727204263.90704: Calling groups_inventory to load vars for managed-node2 22690 1727204263.90707: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204263.90719: Calling all_plugins_play to load vars for managed-node2 22690 1727204263.90723: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204263.90727: Calling groups_plugins_play to load vars for managed-node2 22690 1727204263.92919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204263.96051: done with get_vars() 22690 1727204263.96093: done getting variables 22690 1727204263.96157: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:43 -0400 (0:00:00.166) 0:00:31.245 ***** 22690 1727204263.96193: entering _queue_task() for managed-node2/service 22690 1727204263.96569: worker is 1 (out of 1 available) 22690 1727204263.96583: exiting _queue_task() for managed-node2/service 22690 1727204263.96598: done queuing things up, now waiting for results queue to drain 22690 1727204263.96600: waiting for pending results... 22690 1727204263.97003: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 22690 1727204263.97143: in run() - task 127b8e07-fff9-78bb-bf56-00000000004a 22690 1727204263.97172: variable 'ansible_search_path' from source: unknown 22690 1727204263.97192: variable 'ansible_search_path' from source: unknown 22690 1727204263.97302: calling self._execute() 22690 1727204263.97372: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204263.97385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204263.97399: variable 'omit' from source: magic vars 22690 1727204263.97898: variable 'ansible_distribution_major_version' from source: facts 22690 1727204263.97934: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204263.98145: variable 'network_provider' from source: set_fact 22690 1727204263.98158: Evaluated conditional (network_provider == "initscripts"): False 22690 1727204263.98186: when evaluation is False, skipping this task 22690 1727204263.98215: _execute() done 22690 1727204263.98220: dumping result to json 22690 1727204263.98222: done dumping result, returning 22690 1727204263.98277: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-78bb-bf56-00000000004a] 22690 1727204263.98280: sending task result for task 127b8e07-fff9-78bb-bf56-00000000004a skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204263.98620: no more pending results, returning what we have 22690 1727204263.98625: results queue empty 22690 1727204263.98626: checking for 
any_errors_fatal 22690 1727204263.98639: done checking for any_errors_fatal 22690 1727204263.98640: checking for max_fail_percentage 22690 1727204263.98642: done checking for max_fail_percentage 22690 1727204263.98644: checking to see if all hosts have failed and the running result is not ok 22690 1727204263.98645: done checking to see if all hosts have failed 22690 1727204263.98645: getting the remaining hosts for this loop 22690 1727204263.98647: done getting the remaining hosts for this loop 22690 1727204263.98652: getting the next task for host managed-node2 22690 1727204263.98659: done getting next task for host managed-node2 22690 1727204263.98664: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22690 1727204263.98669: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204263.98689: getting variables 22690 1727204263.98691: in VariableManager get_vars() 22690 1727204263.98740: Calling all_inventory to load vars for managed-node2 22690 1727204263.98743: Calling groups_inventory to load vars for managed-node2 22690 1727204263.98746: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204263.98761: Calling all_plugins_play to load vars for managed-node2 22690 1727204263.98967: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204263.98975: Calling groups_plugins_play to load vars for managed-node2 22690 1727204263.99633: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000004a 22690 1727204263.99637: WORKER PROCESS EXITING 22690 1727204264.01584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204264.04187: done with get_vars() 22690 1727204264.04228: done getting variables 22690 1727204264.04381: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.082) 0:00:31.327 ***** 22690 1727204264.04420: entering _queue_task() for managed-node2/copy 22690 1727204264.04832: worker is 1 (out of 1 available) 22690 1727204264.04845: exiting _queue_task() for managed-node2/copy 22690 1727204264.04859: done queuing things up, now waiting for results queue to drain 22690 1727204264.04861: waiting for pending results... 
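
Both skips above follow the same pattern: the role evaluates a chain of when: conditions, and the first one that comes back False short-circuits the task. Here network_provider is "nm", so the wpa_supplicant task survived the provider check but fell to __network_wpa_supplicant_required being false (the role default that tracks whether any wireless or 802.1X connections are defined), while the initscripts-only service task failed network_provider == "initscripts" outright. The sketch below illustrates that conditional structure; the when: expressions are the ones the log shows being evaluated, but the module names and arguments are assumed for illustration and are not copied from the role's task files.

- name: Enable and start wpa_supplicant    # skipped in this run
  ansible.builtin.systemd:                 # assumed module/args; only the conditions below come from the log
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm"
    - __network_wpa_supplicant_required

- name: Enable network service             # skipped in this run
  ansible.builtin.service:                 # assumed module/args
    name: network
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"

A skipped task produces the result shape seen in the log: {"changed": false, "false_condition": "<the condition that failed>", "skip_reason": "Conditional result was False"} (or a censored variant when no_log is set, as with the network service task above).
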
22690 1727204264.05200: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22690 1727204264.05333: in run() - task 127b8e07-fff9-78bb-bf56-00000000004b 22690 1727204264.05358: variable 'ansible_search_path' from source: unknown 22690 1727204264.05370: variable 'ansible_search_path' from source: unknown 22690 1727204264.05421: calling self._execute() 22690 1727204264.05533: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.05546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.05563: variable 'omit' from source: magic vars 22690 1727204264.05981: variable 'ansible_distribution_major_version' from source: facts 22690 1727204264.06001: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204264.06135: variable 'network_provider' from source: set_fact 22690 1727204264.06147: Evaluated conditional (network_provider == "initscripts"): False 22690 1727204264.06155: when evaluation is False, skipping this task 22690 1727204264.06163: _execute() done 22690 1727204264.06174: dumping result to json 22690 1727204264.06183: done dumping result, returning 22690 1727204264.06196: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-78bb-bf56-00000000004b] 22690 1727204264.06207: sending task result for task 127b8e07-fff9-78bb-bf56-00000000004b skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22690 1727204264.06386: no more pending results, returning what we have 22690 1727204264.06391: results queue empty 22690 1727204264.06392: checking for any_errors_fatal 22690 1727204264.06397: done checking for any_errors_fatal 22690 1727204264.06398: checking for max_fail_percentage 22690 1727204264.06400: done checking for max_fail_percentage 22690 1727204264.06401: checking to see if all hosts have failed and the running result is not ok 22690 1727204264.06402: done checking to see if all hosts have failed 22690 1727204264.06403: getting the remaining hosts for this loop 22690 1727204264.06404: done getting the remaining hosts for this loop 22690 1727204264.06408: getting the next task for host managed-node2 22690 1727204264.06417: done getting next task for host managed-node2 22690 1727204264.06421: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22690 1727204264.06423: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204264.06447: getting variables 22690 1727204264.06449: in VariableManager get_vars() 22690 1727204264.06743: Calling all_inventory to load vars for managed-node2 22690 1727204264.06746: Calling groups_inventory to load vars for managed-node2 22690 1727204264.06749: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204264.06764: Calling all_plugins_play to load vars for managed-node2 22690 1727204264.06769: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204264.06773: Calling groups_plugins_play to load vars for managed-node2 22690 1727204264.07314: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000004b 22690 1727204264.07318: WORKER PROCESS EXITING 22690 1727204264.09691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204264.12101: done with get_vars() 22690 1727204264.12140: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.078) 0:00:31.405 ***** 22690 1727204264.12234: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22690 1727204264.12646: worker is 1 (out of 1 available) 22690 1727204264.12659: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22690 1727204264.12676: done queuing things up, now waiting for results queue to drain 22690 1727204264.12677: waiting for pending results... 22690 1727204264.13100: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22690 1727204264.13289: in run() - task 127b8e07-fff9-78bb-bf56-00000000004c 22690 1727204264.13296: variable 'ansible_search_path' from source: unknown 22690 1727204264.13303: variable 'ansible_search_path' from source: unknown 22690 1727204264.13306: calling self._execute() 22690 1727204264.13437: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.13443: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.13540: variable 'omit' from source: magic vars 22690 1727204264.13994: variable 'ansible_distribution_major_version' from source: facts 22690 1727204264.14012: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204264.14021: variable 'omit' from source: magic vars 22690 1727204264.14069: variable 'omit' from source: magic vars 22690 1727204264.14304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204264.18039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204264.18296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204264.18300: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204264.18303: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204264.18309: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204264.18408: variable 'network_provider' from source: set_fact 22690 1727204264.18617: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204264.18677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204264.18716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204264.18802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204264.18814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204264.18963: variable 'omit' from source: magic vars 22690 1727204264.19143: variable 'omit' from source: magic vars 22690 1727204264.19298: variable 'network_connections' from source: play vars 22690 1727204264.19314: variable 'profile' from source: play vars 22690 1727204264.19413: variable 'profile' from source: play vars 22690 1727204264.19417: variable 'interface' from source: set_fact 22690 1727204264.19484: variable 'interface' from source: set_fact 22690 1727204264.19647: variable 'omit' from source: magic vars 22690 1727204264.19655: variable '__lsr_ansible_managed' from source: task vars 22690 1727204264.19717: variable '__lsr_ansible_managed' from source: task vars 22690 1727204264.20074: Loaded config def from plugin (lookup/template) 22690 1727204264.20080: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22690 1727204264.20110: File lookup term: get_ansible_managed.j2 22690 1727204264.20114: variable 'ansible_search_path' from source: unknown 22690 1727204264.20122: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22690 1727204264.20137: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22690 1727204264.20156: variable 'ansible_search_path' from source: unknown 22690 1727204264.30295: variable 'ansible_managed' from source: unknown 22690 
1727204264.30553: variable 'omit' from source: magic vars 22690 1727204264.30646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204264.30656: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204264.30686: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204264.30711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204264.30728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204264.30798: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204264.30850: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.30853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.31040: Set connection var ansible_connection to ssh 22690 1727204264.31072: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204264.31091: Set connection var ansible_pipelining to False 22690 1727204264.31192: Set connection var ansible_shell_type to sh 22690 1727204264.31195: Set connection var ansible_shell_executable to /bin/sh 22690 1727204264.31198: Set connection var ansible_timeout to 10 22690 1727204264.31204: variable 'ansible_shell_executable' from source: unknown 22690 1727204264.31206: variable 'ansible_connection' from source: unknown 22690 1727204264.31208: variable 'ansible_module_compression' from source: unknown 22690 1727204264.31210: variable 'ansible_shell_type' from source: unknown 22690 1727204264.31213: variable 'ansible_shell_executable' from source: unknown 22690 1727204264.31215: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.31217: variable 'ansible_pipelining' from source: unknown 22690 1727204264.31219: variable 'ansible_timeout' from source: unknown 22690 1727204264.31221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.31560: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204264.31576: variable 'omit' from source: magic vars 22690 1727204264.31579: starting attempt loop 22690 1727204264.31582: running the handler 22690 1727204264.31584: _low_level_execute_command(): starting 22690 1727204264.31587: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204264.32375: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204264.32379: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204264.32393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204264.32428: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 
1727204264.32443: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.32594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204264.32640: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204264.32782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204264.34543: stdout chunk (state=3): >>>/root <<< 22690 1727204264.34721: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204264.34725: stderr chunk (state=3): >>><<< 22690 1727204264.34727: stdout chunk (state=3): >>><<< 22690 1727204264.34973: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204264.34977: _low_level_execute_command(): starting 22690 1727204264.34981: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778 `" && echo ansible-tmp-1727204264.3474936-24832-165319080144778="` echo /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778 `" ) && sleep 0' 22690 1727204264.35407: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204264.35423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204264.35427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204264.35442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204264.35454: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204264.35461: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204264.35474: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.35488: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204264.35496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204264.35503: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204264.35511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204264.35533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204264.35536: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204264.35642: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204264.35645: stderr chunk (state=3): >>>debug2: match found <<< 22690 1727204264.35648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.35651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204264.35654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204264.35672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204264.35782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204264.37774: stdout chunk (state=3): >>>ansible-tmp-1727204264.3474936-24832-165319080144778=/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778 <<< 22690 1727204264.37885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204264.37996: stderr chunk (state=3): >>><<< 22690 1727204264.38001: stdout chunk (state=3): >>><<< 22690 1727204264.38076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204264.3474936-24832-165319080144778=/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204264.38085: variable 'ansible_module_compression' from source: unknown 22690 1727204264.38136: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 22690 
1727204264.38213: variable 'ansible_facts' from source: unknown 22690 1727204264.38354: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py 22690 1727204264.38601: Sending initial data 22690 1727204264.38604: Sent initial data (168 bytes) 22690 1727204264.39294: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204264.39336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.39401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204264.39444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204264.39534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204264.41157: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204264.41260: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204264.41355: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp3etip988 /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py <<< 22690 1727204264.41358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py" <<< 22690 1727204264.41420: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp3etip988" to remote "/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py" <<< 22690 1727204264.42917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204264.42922: stdout chunk (state=3): >>><<< 22690 1727204264.42925: stderr chunk (state=3): >>><<< 22690 1727204264.42927: done transferring module to remote 22690 1727204264.42930: _low_level_execute_command(): starting 22690 1727204264.42933: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/ /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py && sleep 0' 22690 1727204264.43688: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.43693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.43726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204264.43743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204264.43851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204264.45797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204264.45801: stdout chunk (state=3): >>><<< 22690 1727204264.45804: stderr chunk (state=3): >>><<< 22690 1727204264.45825: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204264.45840: _low_level_execute_command(): starting 22690 1727204264.45879: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/AnsiballZ_network_connections.py && sleep 0' 22690 1727204264.46673: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204264.46835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204264.46845: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204264.46919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204264.79377: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22690 1727204264.81347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204264.81358: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204264.81576: stderr chunk (state=3): >>><<< 22690 1727204264.81580: stdout chunk (state=3): >>><<< 22690 1727204264.81583: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
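The chunks above show the transferred AnsiballZ_network_connections.py wrapper being executed over the multiplexed SSH connection: the module prints its result as a single JSON object on stdout, while all of the OpenSSH debug1:/debug2: chatter arrives on stderr. As a rough, self-contained sketch of that pattern (not Ansible's actual implementation, which goes through the ssh connection plugin's exec_command() inside _low_level_execute_command()), the hypothetical helper below runs a remote script the same way and separates the JSON result from the stderr noise:

```python
import json
import subprocess

def run_remote_module(host: str, remote_script: str) -> dict:
    """Run an already-transferred AnsiballZ-style wrapper over ssh and parse
    the single JSON document it prints on stdout.

    Hypothetical helper for illustration only; the real code path is the ssh
    connection plugin's exec_command() called from _low_level_execute_command().
    """
    proc = subprocess.run(
        ["ssh", host, f"/usr/bin/python3.12 {remote_script} && sleep 0"],
        capture_output=True,
        text=True,
        check=False,
    )
    # The module result is the JSON object on stdout; the verbose ssh
    # debug1:/debug2: lines seen in the log go to stderr and are ignored here.
    result = json.loads(proc.stdout)
    result.setdefault("rc", proc.returncode)
    return result

# Hypothetical usage, mirroring the command in the log:
# run_remote_module(
#     "10.31.47.73",
#     "/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/"
#     "AnsiballZ_network_connections.py",
# )
```

The `&& sleep 0` suffix simply mirrors the command form used throughout this log; it does not change the result.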
22690 1727204264.81586: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204264.81588: _low_level_execute_command(): starting 22690 1727204264.81591: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204264.3474936-24832-165319080144778/ > /dev/null 2>&1 && sleep 0' 22690 1727204264.82206: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204264.82223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204264.82241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204264.82262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204264.82281: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204264.82291: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204264.82302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.82320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204264.82331: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204264.82347: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204264.82388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204264.82488: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204264.82506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204264.82528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204264.82630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204264.84693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204264.84699: stderr chunk (state=3): >>><<< 22690 1727204264.84702: stdout chunk (state=3): >>><<< 22690 1727204264.84708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204264.84711: handler run complete 22690 1727204264.84747: attempt loop complete, returning result 22690 1727204264.84755: _execute() done 22690 1727204264.84762: dumping result to json 22690 1727204264.84776: done dumping result, returning 22690 1727204264.84791: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-78bb-bf56-00000000004c] 22690 1727204264.84802: sending task result for task 127b8e07-fff9-78bb-bf56-00000000004c changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 22690 1727204264.85289: no more pending results, returning what we have 22690 1727204264.85294: results queue empty 22690 1727204264.85295: checking for any_errors_fatal 22690 1727204264.85305: done checking for any_errors_fatal 22690 1727204264.85306: checking for max_fail_percentage 22690 1727204264.85307: done checking for max_fail_percentage 22690 1727204264.85308: checking to see if all hosts have failed and the running result is not ok 22690 1727204264.85309: done checking to see if all hosts have failed 22690 1727204264.85310: getting the remaining hosts for this loop 22690 1727204264.85312: done getting the remaining hosts for this loop 22690 1727204264.85319: getting the next task for host managed-node2 22690 1727204264.85326: done getting next task for host managed-node2 22690 1727204264.85331: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22690 1727204264.85333: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204264.85345: getting variables 22690 1727204264.85347: in VariableManager get_vars() 22690 1727204264.85392: Calling all_inventory to load vars for managed-node2 22690 1727204264.85395: Calling groups_inventory to load vars for managed-node2 22690 1727204264.85398: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204264.85578: Calling all_plugins_play to load vars for managed-node2 22690 1727204264.85582: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204264.85586: Calling groups_plugins_play to load vars for managed-node2 22690 1727204264.86284: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000004c 22690 1727204264.86288: WORKER PROCESS EXITING 22690 1727204264.87542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204264.89758: done with get_vars() 22690 1727204264.89797: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.776) 0:00:32.182 ***** 22690 1727204264.89897: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22690 1727204264.90657: worker is 1 (out of 1 available) 22690 1727204264.90676: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22690 1727204264.90691: done queuing things up, now waiting for results queue to drain 22690 1727204264.90692: waiting for pending results... 22690 1727204264.91184: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 22690 1727204264.91419: in run() - task 127b8e07-fff9-78bb-bf56-00000000004d 22690 1727204264.91442: variable 'ansible_search_path' from source: unknown 22690 1727204264.91448: variable 'ansible_search_path' from source: unknown 22690 1727204264.91492: calling self._execute() 22690 1727204264.91600: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.91616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.91635: variable 'omit' from source: magic vars 22690 1727204264.92022: variable 'ansible_distribution_major_version' from source: facts 22690 1727204264.92039: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204264.92175: variable 'network_state' from source: role '' defaults 22690 1727204264.92191: Evaluated conditional (network_state != {}): False 22690 1727204264.92198: when evaluation is False, skipping this task 22690 1727204264.92204: _execute() done 22690 1727204264.92211: dumping result to json 22690 1727204264.92217: done dumping result, returning 22690 1727204264.92228: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-78bb-bf56-00000000004d] 22690 1727204264.92237: sending task result for task 127b8e07-fff9-78bb-bf56-00000000004d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204264.92401: no more pending results, returning what we have 22690 1727204264.92406: results queue empty 22690 1727204264.92407: checking for any_errors_fatal 22690 1727204264.92421: done checking for any_errors_fatal 22690 1727204264.92422: checking for max_fail_percentage 22690 
1727204264.92424: done checking for max_fail_percentage 22690 1727204264.92425: checking to see if all hosts have failed and the running result is not ok 22690 1727204264.92426: done checking to see if all hosts have failed 22690 1727204264.92427: getting the remaining hosts for this loop 22690 1727204264.92428: done getting the remaining hosts for this loop 22690 1727204264.92433: getting the next task for host managed-node2 22690 1727204264.92439: done getting next task for host managed-node2 22690 1727204264.92443: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22690 1727204264.92446: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204264.92464: getting variables 22690 1727204264.92468: in VariableManager get_vars() 22690 1727204264.92509: Calling all_inventory to load vars for managed-node2 22690 1727204264.92511: Calling groups_inventory to load vars for managed-node2 22690 1727204264.92514: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204264.92530: Calling all_plugins_play to load vars for managed-node2 22690 1727204264.92533: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204264.92536: Calling groups_plugins_play to load vars for managed-node2 22690 1727204264.93086: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000004d 22690 1727204264.93091: WORKER PROCESS EXITING 22690 1727204264.93960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204264.95368: done with get_vars() 22690 1727204264.95397: done getting variables 22690 1727204264.95450: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:44 -0400 (0:00:00.055) 0:00:32.238 ***** 22690 1727204264.95479: entering _queue_task() for managed-node2/debug 22690 1727204264.95767: worker is 1 (out of 1 available) 22690 1727204264.95782: exiting _queue_task() for managed-node2/debug 22690 1727204264.95797: done queuing things up, now waiting for results queue to drain 22690 1727204264.95798: waiting for pending results... 
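In the entries above, the "Configure networking state" task evaluates its conditional (network_state != {}) against the role-default network_state (empty, so the comparison is False) and is skipped with skip_reason "Conditional result was False". The toy sketch below is not Ansible's TaskExecutor, which renders the expression through Jinja2 against the full variable namespace; it only illustrates how a false `when:` expression turns into that kind of skipped result:

```python
def evaluate_conditional(when_expression: str, variables: dict) -> dict:
    """Toy model of how a false `when:` conditional becomes a skipped task
    result, loosely based on the log entries above. Ansible itself renders
    the expression with Jinja2 inside TaskExecutor; a plain eval() over the
    variable namespace stands in for that here.
    """
    if not bool(eval(when_expression, {}, dict(variables))):
        return {
            "changed": False,
            "skipped": True,
            "false_condition": when_expression,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": False, "skipped": False}

# network_state defaults to an empty mapping, so the task above is skipped:
print(evaluate_conditional("network_state != {}", {"network_state": {}}))
# -> {'changed': False, 'skipped': True,
#     'false_condition': 'network_state != {}',
#     'skip_reason': 'Conditional result was False'}
```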
22690 1727204264.95988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22690 1727204264.96066: in run() - task 127b8e07-fff9-78bb-bf56-00000000004e 22690 1727204264.96083: variable 'ansible_search_path' from source: unknown 22690 1727204264.96086: variable 'ansible_search_path' from source: unknown 22690 1727204264.96124: calling self._execute() 22690 1727204264.96210: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.96219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.96228: variable 'omit' from source: magic vars 22690 1727204264.96527: variable 'ansible_distribution_major_version' from source: facts 22690 1727204264.96538: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204264.96544: variable 'omit' from source: magic vars 22690 1727204264.96582: variable 'omit' from source: magic vars 22690 1727204264.96611: variable 'omit' from source: magic vars 22690 1727204264.96649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204264.96686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204264.96704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204264.96750: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204264.96753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204264.96774: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204264.96777: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.96780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.96972: Set connection var ansible_connection to ssh 22690 1727204264.96976: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204264.96978: Set connection var ansible_pipelining to False 22690 1727204264.96981: Set connection var ansible_shell_type to sh 22690 1727204264.96983: Set connection var ansible_shell_executable to /bin/sh 22690 1727204264.96985: Set connection var ansible_timeout to 10 22690 1727204264.96987: variable 'ansible_shell_executable' from source: unknown 22690 1727204264.96989: variable 'ansible_connection' from source: unknown 22690 1727204264.96992: variable 'ansible_module_compression' from source: unknown 22690 1727204264.96994: variable 'ansible_shell_type' from source: unknown 22690 1727204264.97015: variable 'ansible_shell_executable' from source: unknown 22690 1727204264.97024: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204264.97042: variable 'ansible_pipelining' from source: unknown 22690 1727204264.97049: variable 'ansible_timeout' from source: unknown 22690 1727204264.97082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204264.97240: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 
1727204264.97257: variable 'omit' from source: magic vars 22690 1727204264.97268: starting attempt loop 22690 1727204264.97300: running the handler 22690 1727204264.97447: variable '__network_connections_result' from source: set_fact 22690 1727204264.97518: handler run complete 22690 1727204264.97546: attempt loop complete, returning result 22690 1727204264.97623: _execute() done 22690 1727204264.97627: dumping result to json 22690 1727204264.97630: done dumping result, returning 22690 1727204264.97633: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-78bb-bf56-00000000004e] 22690 1727204264.97635: sending task result for task 127b8e07-fff9-78bb-bf56-00000000004e 22690 1727204264.97877: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000004e 22690 1727204264.97880: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 22690 1727204264.97948: no more pending results, returning what we have 22690 1727204264.97953: results queue empty 22690 1727204264.97954: checking for any_errors_fatal 22690 1727204264.97962: done checking for any_errors_fatal 22690 1727204264.97963: checking for max_fail_percentage 22690 1727204264.97967: done checking for max_fail_percentage 22690 1727204264.97969: checking to see if all hosts have failed and the running result is not ok 22690 1727204264.97970: done checking to see if all hosts have failed 22690 1727204264.97971: getting the remaining hosts for this loop 22690 1727204264.97972: done getting the remaining hosts for this loop 22690 1727204264.97977: getting the next task for host managed-node2 22690 1727204264.97984: done getting next task for host managed-node2 22690 1727204264.97987: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22690 1727204264.97990: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204264.98002: getting variables 22690 1727204264.98004: in VariableManager get_vars() 22690 1727204264.98046: Calling all_inventory to load vars for managed-node2 22690 1727204264.98049: Calling groups_inventory to load vars for managed-node2 22690 1727204264.98051: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204264.98064: Calling all_plugins_play to load vars for managed-node2 22690 1727204264.98250: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204264.98256: Calling groups_plugins_play to load vars for managed-node2 22690 1727204264.99280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.00540: done with get_vars() 22690 1727204265.00574: done getting variables 22690 1727204265.00648: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.052) 0:00:32.290 ***** 22690 1727204265.00685: entering _queue_task() for managed-node2/debug 22690 1727204265.01088: worker is 1 (out of 1 available) 22690 1727204265.01104: exiting _queue_task() for managed-node2/debug 22690 1727204265.01119: done queuing things up, now waiting for results queue to drain 22690 1727204265.01121: waiting for pending results... 22690 1727204265.01406: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22690 1727204265.01539: in run() - task 127b8e07-fff9-78bb-bf56-00000000004f 22690 1727204265.01567: variable 'ansible_search_path' from source: unknown 22690 1727204265.01575: variable 'ansible_search_path' from source: unknown 22690 1727204265.01628: calling self._execute() 22690 1727204265.01754: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.01761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.01775: variable 'omit' from source: magic vars 22690 1727204265.02111: variable 'ansible_distribution_major_version' from source: facts 22690 1727204265.02122: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204265.02129: variable 'omit' from source: magic vars 22690 1727204265.02167: variable 'omit' from source: magic vars 22690 1727204265.02370: variable 'omit' from source: magic vars 22690 1727204265.02374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204265.02377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204265.02380: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204265.02384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204265.02387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204265.02394: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204265.02403: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.02411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.02518: Set connection var ansible_connection to ssh 22690 1727204265.02537: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204265.02550: Set connection var ansible_pipelining to False 22690 1727204265.02558: Set connection var ansible_shell_type to sh 22690 1727204265.02569: Set connection var ansible_shell_executable to /bin/sh 22690 1727204265.02583: Set connection var ansible_timeout to 10 22690 1727204265.02613: variable 'ansible_shell_executable' from source: unknown 22690 1727204265.02626: variable 'ansible_connection' from source: unknown 22690 1727204265.02635: variable 'ansible_module_compression' from source: unknown 22690 1727204265.02642: variable 'ansible_shell_type' from source: unknown 22690 1727204265.02650: variable 'ansible_shell_executable' from source: unknown 22690 1727204265.02657: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.02667: variable 'ansible_pipelining' from source: unknown 22690 1727204265.02676: variable 'ansible_timeout' from source: unknown 22690 1727204265.02683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.02851: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204265.02873: variable 'omit' from source: magic vars 22690 1727204265.02883: starting attempt loop 22690 1727204265.02890: running the handler 22690 1727204265.02950: variable '__network_connections_result' from source: set_fact 22690 1727204265.03043: variable '__network_connections_result' from source: set_fact 22690 1727204265.03169: handler run complete 22690 1727204265.03206: attempt loop complete, returning result 22690 1727204265.03272: _execute() done 22690 1727204265.03275: dumping result to json 22690 1727204265.03278: done dumping result, returning 22690 1727204265.03281: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-78bb-bf56-00000000004f] 22690 1727204265.03284: sending task result for task 127b8e07-fff9-78bb-bf56-00000000004f 22690 1727204265.03377: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000004f 22690 1727204265.03382: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 22690 1727204265.03479: no more pending results, returning what we have 22690 1727204265.03484: results queue empty 22690 1727204265.03484: checking for any_errors_fatal 22690 1727204265.03490: done checking for any_errors_fatal 22690 1727204265.03495: checking for max_fail_percentage 22690 1727204265.03497: done checking for max_fail_percentage 22690 1727204265.03498: checking to see if all hosts have 
failed and the running result is not ok 22690 1727204265.03499: done checking to see if all hosts have failed 22690 1727204265.03500: getting the remaining hosts for this loop 22690 1727204265.03501: done getting the remaining hosts for this loop 22690 1727204265.03506: getting the next task for host managed-node2 22690 1727204265.03513: done getting next task for host managed-node2 22690 1727204265.03519: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22690 1727204265.03521: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204265.03532: getting variables 22690 1727204265.03533: in VariableManager get_vars() 22690 1727204265.03577: Calling all_inventory to load vars for managed-node2 22690 1727204265.03605: Calling groups_inventory to load vars for managed-node2 22690 1727204265.03609: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.03623: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.03626: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.03629: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.09221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.10882: done with get_vars() 22690 1727204265.10929: done getting variables 22690 1727204265.11000: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.103) 0:00:32.393 ***** 22690 1727204265.11045: entering _queue_task() for managed-node2/debug 22690 1727204265.11478: worker is 1 (out of 1 available) 22690 1727204265.11492: exiting _queue_task() for managed-node2/debug 22690 1727204265.11507: done queuing things up, now waiting for results queue to drain 22690 1727204265.11509: waiting for pending results... 
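The debug task above prints the registered __network_connections_result in full: the module arguments that were passed (provider "nm", a single connection "lsr27" with state "down", and the "Ansible managed" header), plus stderr "\n" reported as stderr_lines [""]. The snippet below is only an illustration of how those pieces fit together; the field names and values are taken directly from the log:

```python
# Module arguments exactly as reported in __network_connections_result above.
module_args = {
    "provider": "nm",
    "connections": [{"name": "lsr27", "state": "down"}],
    "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
    "ignore_errors": False,
    "force_state_change": False,
    "__debug_flags": "",
}

# The *_lines fields are the newline-split form of the raw text, which is
# why a lone trailing newline shows up as a single empty string.
stderr = "\n"
stderr_lines = stderr.splitlines()
assert stderr_lines == [""]

# __header is the "Ansible managed" banner passed to the module; how the
# module consumes it is not shown in this part of the log.
print(module_args["__header"])
```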
22690 1727204265.11828: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22690 1727204265.12023: in run() - task 127b8e07-fff9-78bb-bf56-000000000050 22690 1727204265.12029: variable 'ansible_search_path' from source: unknown 22690 1727204265.12033: variable 'ansible_search_path' from source: unknown 22690 1727204265.12035: calling self._execute() 22690 1727204265.12178: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.12183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.12206: variable 'omit' from source: magic vars 22690 1727204265.12718: variable 'ansible_distribution_major_version' from source: facts 22690 1727204265.12757: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204265.12908: variable 'network_state' from source: role '' defaults 22690 1727204265.12921: Evaluated conditional (network_state != {}): False 22690 1727204265.12926: when evaluation is False, skipping this task 22690 1727204265.12931: _execute() done 22690 1727204265.12938: dumping result to json 22690 1727204265.12941: done dumping result, returning 22690 1727204265.12945: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-78bb-bf56-000000000050] 22690 1727204265.12950: sending task result for task 127b8e07-fff9-78bb-bf56-000000000050 22690 1727204265.13090: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000050 22690 1727204265.13095: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 22690 1727204265.13181: no more pending results, returning what we have 22690 1727204265.13185: results queue empty 22690 1727204265.13186: checking for any_errors_fatal 22690 1727204265.13199: done checking for any_errors_fatal 22690 1727204265.13200: checking for max_fail_percentage 22690 1727204265.13202: done checking for max_fail_percentage 22690 1727204265.13203: checking to see if all hosts have failed and the running result is not ok 22690 1727204265.13204: done checking to see if all hosts have failed 22690 1727204265.13205: getting the remaining hosts for this loop 22690 1727204265.13206: done getting the remaining hosts for this loop 22690 1727204265.13211: getting the next task for host managed-node2 22690 1727204265.13216: done getting next task for host managed-node2 22690 1727204265.13221: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22690 1727204265.13223: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204265.13239: getting variables 22690 1727204265.13240: in VariableManager get_vars() 22690 1727204265.13283: Calling all_inventory to load vars for managed-node2 22690 1727204265.13285: Calling groups_inventory to load vars for managed-node2 22690 1727204265.13287: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.13298: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.13300: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.13303: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.14584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.16031: done with get_vars() 22690 1727204265.16058: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.051) 0:00:32.445 ***** 22690 1727204265.16167: entering _queue_task() for managed-node2/ping 22690 1727204265.16575: worker is 1 (out of 1 available) 22690 1727204265.16589: exiting _queue_task() for managed-node2/ping 22690 1727204265.16604: done queuing things up, now waiting for results queue to drain 22690 1727204265.16606: waiting for pending results... 22690 1727204265.17068: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 22690 1727204265.17075: in run() - task 127b8e07-fff9-78bb-bf56-000000000051 22690 1727204265.17078: variable 'ansible_search_path' from source: unknown 22690 1727204265.17082: variable 'ansible_search_path' from source: unknown 22690 1727204265.17095: calling self._execute() 22690 1727204265.17217: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.17222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.17266: variable 'omit' from source: magic vars 22690 1727204265.17810: variable 'ansible_distribution_major_version' from source: facts 22690 1727204265.17817: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204265.17821: variable 'omit' from source: magic vars 22690 1727204265.17859: variable 'omit' from source: magic vars 22690 1727204265.17958: variable 'omit' from source: magic vars 22690 1727204265.18003: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204265.18270: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204265.18384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204265.18395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204265.18399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204265.18517: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204265.18531: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.18572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.19005: Set connection var ansible_connection to ssh 22690 1727204265.19013: Set connection var 
ansible_module_compression to ZIP_DEFLATED 22690 1727204265.19017: Set connection var ansible_pipelining to False 22690 1727204265.19020: Set connection var ansible_shell_type to sh 22690 1727204265.19022: Set connection var ansible_shell_executable to /bin/sh 22690 1727204265.19237: Set connection var ansible_timeout to 10 22690 1727204265.19307: variable 'ansible_shell_executable' from source: unknown 22690 1727204265.19316: variable 'ansible_connection' from source: unknown 22690 1727204265.19321: variable 'ansible_module_compression' from source: unknown 22690 1727204265.19326: variable 'ansible_shell_type' from source: unknown 22690 1727204265.19330: variable 'ansible_shell_executable' from source: unknown 22690 1727204265.19333: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.19336: variable 'ansible_pipelining' from source: unknown 22690 1727204265.19340: variable 'ansible_timeout' from source: unknown 22690 1727204265.19343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.20083: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204265.20089: variable 'omit' from source: magic vars 22690 1727204265.20092: starting attempt loop 22690 1727204265.20094: running the handler 22690 1727204265.20097: _low_level_execute_command(): starting 22690 1727204265.20099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204265.21124: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204265.21144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204265.21185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.21210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204265.21223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204265.21323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204265.23042: stdout chunk (state=3): >>>/root <<< 22690 1727204265.23164: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204265.23231: stderr chunk (state=3): >>><<< 22690 1727204265.23237: stdout chunk (state=3): >>><<< 22690 1727204265.23263: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
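
Every _low_level_execute_command() above reuses an already-open OpenSSH ControlMaster socket (the "auto-mux: Trying existing master at '/root/.ansible/cp/...'" lines), so each command is a new mux session rather than a fresh TCP and authentication handshake. A rough stand-alone equivalent using the plain ssh client; the user, socket path and helper name are illustrative assumptions:

    # Sketch: run a remote command over a persistent, multiplexed SSH connection,
    # the same ControlMaster/ControlPath mechanism the log shows Ansible using.
    import subprocess

    HOST = "root@10.31.47.73"                  # address taken from the log; user assumed
    CONTROL_PATH = "/root/.ansible/cp/%C"      # %C expands to a connection hash,
                                               # comparable to the 7ef5e35320 socket above

    def run_remote(command):
        return subprocess.run(
            ["ssh",
             "-o", "ControlMaster=auto",       # start a master if none exists yet
             "-o", f"ControlPath={CONTROL_PATH}",
             "-o", "ControlPersist=60s",       # keep it alive for follow-up commands
             HOST, command],
            capture_output=True, text=True, check=True,
        ).stdout

    print(run_remote("echo ~ && sleep 0"))     # same home-directory probe as the log -> /root
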
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204265.23300: _low_level_execute_command(): starting 22690 1727204265.23416: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788 `" && echo ansible-tmp-1727204265.232805-24925-2512593014788="` echo /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788 `" ) && sleep 0' 22690 1727204265.24003: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204265.24007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.24009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204265.24031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.24064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204265.24081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204265.24161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204265.26220: stdout chunk (state=3): >>>ansible-tmp-1727204265.232805-24925-2512593014788=/root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788 <<< 22690 1727204265.26373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204265.26435: stderr chunk (state=3): >>><<< 22690 1727204265.26439: stdout chunk (state=3): >>><<< 22690 1727204265.26458: _low_level_execute_command() done: rc=0, 
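
The command just executed builds the per-task remote working directory: umask 77 keeps it private, the nested mkdir creates a uniquely named ansible-tmp-<timestamp>-<nn>-<nnnn> directory, and the trailing echo reports the resulting path back to the controller. A simplified reconstruction of that command pattern (the naming is approximate; run_remote() refers to the illustrative helper sketched earlier):

    # Sketch: generate the same style of remote tmp-dir command seen in the log.
    import random, time

    def make_remote_tmp_command(base="/root/.ansible/tmp"):
        # e.g. ansible-tmp-1727204265.232805-24925-2512593014788 in the log;
        # the middle and last components are treated here as arbitrary numbers.
        name = "ansible-tmp-%s-%s-%s" % (
            time.time(), random.randint(1, 2**15), random.randint(1, 2**48))
        path = f"{base}/{name}"
        command = (
            f'( umask 77 && mkdir -p "{base}" && mkdir "{path}" '
            f'&& echo {name}="{path}" ) && sleep 0'
        )
        return path, command

    tmp_path, cmd = make_remote_tmp_command()
    # run_remote(cmd) would print "ansible-tmp-...=<path>", which tells the
    # controller where to upload AnsiballZ_ping.py in the next step.
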
stdout=ansible-tmp-1727204265.232805-24925-2512593014788=/root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204265.26512: variable 'ansible_module_compression' from source: unknown 22690 1727204265.26561: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 22690 1727204265.26605: variable 'ansible_facts' from source: unknown 22690 1727204265.26659: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py 22690 1727204265.26833: Sending initial data 22690 1727204265.26837: Sent initial data (150 bytes) 22690 1727204265.27486: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.27552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204265.27559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204265.27581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204265.27688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204265.29450: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204265.29521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204265.29594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmptfy2qcso /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py <<< 22690 1727204265.29599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py" <<< 22690 1727204265.29782: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmptfy2qcso" to remote "/root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py" <<< 22690 1727204265.30901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204265.30990: stderr chunk (state=3): >>><<< 22690 1727204265.30999: stdout chunk (state=3): >>><<< 22690 1727204265.31032: done transferring module to remote 22690 1727204265.31048: _low_level_execute_command(): starting 22690 1727204265.31057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/ /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py && sleep 0' 22690 1727204265.31703: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204265.31722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204265.31738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204265.31750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204265.31762: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.31838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204265.31899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204265.31993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
22690 1727204265.33844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204265.33929: stderr chunk (state=3): >>><<< 22690 1727204265.33933: stdout chunk (state=3): >>><<< 22690 1727204265.33947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204265.33950: _low_level_execute_command(): starting 22690 1727204265.33955: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/AnsiballZ_ping.py && sleep 0' 22690 1727204265.34534: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204265.34538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204265.34541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.34544: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204265.34546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204265.34614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204265.34620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204265.34738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204265.51121: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22690 1727204265.52390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
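
AnsiballZ_ping.py is a self-contained payload that bundles the ping module with its module_utils; run by the remote /usr/bin/python3.12 it prints exactly the JSON seen above. Stripped of the packaging, the module logic is tiny. A sketch of a ping-style module written against the public AnsibleModule helper (close to what the stock module does, minus its deliberate-failure branch):

    # Sketch of a ping-style module: parse args, reply with {"ping": "pong"}.
    from ansible.module_utils.basic import AnsibleModule

    def main():
        module = AnsibleModule(
            argument_spec=dict(
                data=dict(type="str", default="pong"),
            ),
            supports_check_mode=True,
        )
        # Produces stdout like the log:
        # {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}
        module.exit_json(ping=module.params["data"])

    if __name__ == "__main__":
        main()
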
<<< 22690 1727204265.52460: stderr chunk (state=3): >>><<< 22690 1727204265.52481: stdout chunk (state=3): >>><<< 22690 1727204265.52576: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204265.52580: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204265.52584: _low_level_execute_command(): starting 22690 1727204265.52587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204265.232805-24925-2512593014788/ > /dev/null 2>&1 && sleep 0' 22690 1727204265.53252: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204265.53280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204265.53390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204265.53416: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204265.53516: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204265.55551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204265.55577: stderr chunk (state=3): >>><<< 22690 1727204265.55581: stdout chunk (state=3): >>><<< 22690 1727204265.55599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204265.55607: handler run complete 22690 1727204265.55632: attempt loop complete, returning result 22690 1727204265.55636: _execute() done 22690 1727204265.55639: dumping result to json 22690 1727204265.55641: done dumping result, returning 22690 1727204265.55648: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-78bb-bf56-000000000051] 22690 1727204265.55650: sending task result for task 127b8e07-fff9-78bb-bf56-000000000051 22690 1727204265.55745: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000051 22690 1727204265.55748: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 22690 1727204265.55817: no more pending results, returning what we have 22690 1727204265.55821: results queue empty 22690 1727204265.55822: checking for any_errors_fatal 22690 1727204265.55829: done checking for any_errors_fatal 22690 1727204265.55830: checking for max_fail_percentage 22690 1727204265.55831: done checking for max_fail_percentage 22690 1727204265.55832: checking to see if all hosts have failed and the running result is not ok 22690 1727204265.55833: done checking to see if all hosts have failed 22690 1727204265.55834: getting the remaining hosts for this loop 22690 1727204265.55835: done getting the remaining hosts for this loop 22690 1727204265.55840: getting the next task for host managed-node2 22690 1727204265.55848: done getting next task for host managed-node2 22690 1727204265.55850: ^ task is: TASK: meta (role_complete) 22690 1727204265.55852: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
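
The executor parses the module's JSON stdout, fills in defaults such as changed=False, drops fields shown only at higher verbosity, and that becomes the "ok: [managed-node2] => {...}" line above. A toy version of that final formatting step (the names here are illustrative, not Ansible internals):

    # Sketch: turn raw module stdout into the condensed per-host result line.
    import json

    def summarize(host, module_stdout):
        result = json.loads(module_stdout)
        result.setdefault("changed", False)      # modules may omit this key
        result.pop("invocation", None)           # only shown at higher verbosity
        status = "failed" if result.get("failed") else (
            "changed" if result["changed"] else "ok")
        return f"{status}: [{host}] => {json.dumps(result, indent=4, sort_keys=True)}"

    stdout = '{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'
    print(summarize("managed-node2", stdout))
    # ok: [managed-node2] => {
    #     "changed": false,
    #     "ping": "pong"
    # }
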
(None), always child state? (None), did rescue? False, did start at task? False 22690 1727204265.55861: getting variables 22690 1727204265.55863: in VariableManager get_vars() 22690 1727204265.55909: Calling all_inventory to load vars for managed-node2 22690 1727204265.55912: Calling groups_inventory to load vars for managed-node2 22690 1727204265.55914: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.55924: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.55927: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.55930: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.56989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.60239: done with get_vars() 22690 1727204265.60285: done getting variables 22690 1727204265.60404: done queuing things up, now waiting for results queue to drain 22690 1727204265.60407: results queue empty 22690 1727204265.60407: checking for any_errors_fatal 22690 1727204265.60412: done checking for any_errors_fatal 22690 1727204265.60412: checking for max_fail_percentage 22690 1727204265.60414: done checking for max_fail_percentage 22690 1727204265.60414: checking to see if all hosts have failed and the running result is not ok 22690 1727204265.60415: done checking to see if all hosts have failed 22690 1727204265.60416: getting the remaining hosts for this loop 22690 1727204265.60417: done getting the remaining hosts for this loop 22690 1727204265.60420: getting the next task for host managed-node2 22690 1727204265.60424: done getting next task for host managed-node2 22690 1727204265.60426: ^ task is: TASK: meta (flush_handlers) 22690 1727204265.60428: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204265.60431: getting variables 22690 1727204265.60432: in VariableManager get_vars() 22690 1727204265.60449: Calling all_inventory to load vars for managed-node2 22690 1727204265.60451: Calling groups_inventory to load vars for managed-node2 22690 1727204265.60454: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.60460: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.60463: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.60468: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.61913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.64394: done with get_vars() 22690 1727204265.64437: done getting variables 22690 1727204265.64526: in VariableManager get_vars() 22690 1727204265.64541: Calling all_inventory to load vars for managed-node2 22690 1727204265.64543: Calling groups_inventory to load vars for managed-node2 22690 1727204265.64545: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.64551: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.64554: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.64557: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.66359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.70590: done with get_vars() 22690 1727204265.70644: done queuing things up, now waiting for results queue to drain 22690 1727204265.70647: results queue empty 22690 1727204265.70648: checking for any_errors_fatal 22690 1727204265.70649: done checking for any_errors_fatal 22690 1727204265.70650: checking for max_fail_percentage 22690 1727204265.70651: done checking for max_fail_percentage 22690 1727204265.70652: checking to see if all hosts have failed and the running result is not ok 22690 1727204265.70652: done checking to see if all hosts have failed 22690 1727204265.70653: getting the remaining hosts for this loop 22690 1727204265.70654: done getting the remaining hosts for this loop 22690 1727204265.70657: getting the next task for host managed-node2 22690 1727204265.70662: done getting next task for host managed-node2 22690 1727204265.70663: ^ task is: TASK: meta (flush_handlers) 22690 1727204265.70859: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204265.70869: getting variables 22690 1727204265.70870: in VariableManager get_vars() 22690 1727204265.70887: Calling all_inventory to load vars for managed-node2 22690 1727204265.70890: Calling groups_inventory to load vars for managed-node2 22690 1727204265.70892: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.70899: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.70901: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.70904: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.74276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.77550: done with get_vars() 22690 1727204265.77587: done getting variables 22690 1727204265.77654: in VariableManager get_vars() 22690 1727204265.77673: Calling all_inventory to load vars for managed-node2 22690 1727204265.77675: Calling groups_inventory to load vars for managed-node2 22690 1727204265.77677: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.77717: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.77722: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.77726: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.80101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.84274: done with get_vars() 22690 1727204265.84437: done queuing things up, now waiting for results queue to drain 22690 1727204265.84440: results queue empty 22690 1727204265.84441: checking for any_errors_fatal 22690 1727204265.84442: done checking for any_errors_fatal 22690 1727204265.84443: checking for max_fail_percentage 22690 1727204265.84444: done checking for max_fail_percentage 22690 1727204265.84445: checking to see if all hosts have failed and the running result is not ok 22690 1727204265.84446: done checking to see if all hosts have failed 22690 1727204265.84447: getting the remaining hosts for this loop 22690 1727204265.84448: done getting the remaining hosts for this loop 22690 1727204265.84451: getting the next task for host managed-node2 22690 1727204265.84455: done getting next task for host managed-node2 22690 1727204265.84456: ^ task is: None 22690 1727204265.84458: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204265.84459: done queuing things up, now waiting for results queue to drain 22690 1727204265.84460: results queue empty 22690 1727204265.84461: checking for any_errors_fatal 22690 1727204265.84461: done checking for any_errors_fatal 22690 1727204265.84462: checking for max_fail_percentage 22690 1727204265.84463: done checking for max_fail_percentage 22690 1727204265.84464: checking to see if all hosts have failed and the running result is not ok 22690 1727204265.84465: done checking to see if all hosts have failed 22690 1727204265.84570: getting the next task for host managed-node2 22690 1727204265.84574: done getting next task for host managed-node2 22690 1727204265.84575: ^ task is: None 22690 1727204265.84576: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204265.84648: in VariableManager get_vars() 22690 1727204265.84671: done with get_vars() 22690 1727204265.84678: in VariableManager get_vars() 22690 1727204265.84688: done with get_vars() 22690 1727204265.84693: variable 'omit' from source: magic vars 22690 1727204265.84727: in VariableManager get_vars() 22690 1727204265.84854: done with get_vars() 22690 1727204265.84882: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 22690 1727204265.85611: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204265.85782: getting the remaining hosts for this loop 22690 1727204265.85784: done getting the remaining hosts for this loop 22690 1727204265.85786: getting the next task for host managed-node2 22690 1727204265.85789: done getting next task for host managed-node2 22690 1727204265.85791: ^ task is: TASK: Gathering Facts 22690 1727204265.85793: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204265.85795: getting variables 22690 1727204265.85796: in VariableManager get_vars() 22690 1727204265.85805: Calling all_inventory to load vars for managed-node2 22690 1727204265.85807: Calling groups_inventory to load vars for managed-node2 22690 1727204265.85810: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204265.85818: Calling all_plugins_play to load vars for managed-node2 22690 1727204265.85821: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204265.85829: Calling groups_plugins_play to load vars for managed-node2 22690 1727204265.89191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204265.94190: done with get_vars() 22690 1727204265.94229: done getting variables 22690 1727204265.94424: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.782) 0:00:33.227 ***** 22690 1727204265.94454: entering _queue_task() for managed-node2/gather_facts 22690 1727204265.95287: worker is 1 (out of 1 available) 22690 1727204265.95373: exiting _queue_task() for managed-node2/gather_facts 22690 1727204265.95418: done queuing things up, now waiting for results queue to drain 22690 1727204265.95421: waiting for pending results... 
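
The Gathering Facts task queued here goes through the same machinery as the ping above, only with AnsiballZ_setup.py. Outside a playbook, the same fact collection can be reproduced with an ad-hoc call; the inventory path and host pattern below are placeholders:

    # Sketch: reproduce the fact-gathering step ad hoc via the ansible CLI.
    import subprocess

    def gather_facts(host_pattern, inventory):
        proc = subprocess.run(
            ["ansible", host_pattern,
             "-i", inventory,
             "-m", "ansible.builtin.setup",
             # mirrors the module_args shown in the setup invocation further below
             "-a", "gather_subset=all gather_timeout=10"],
            capture_output=True, text=True, check=True,
        )
        return proc.stdout   # "managed-node2 | SUCCESS => { ... ansible_facts ... }"

    # gather_facts("managed-node2", "inventory.yml")
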
22690 1727204265.95924: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204265.96374: in run() - task 127b8e07-fff9-78bb-bf56-0000000003f8 22690 1727204265.96379: variable 'ansible_search_path' from source: unknown 22690 1727204265.96382: calling self._execute() 22690 1727204265.96771: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.96776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.96779: variable 'omit' from source: magic vars 22690 1727204265.97579: variable 'ansible_distribution_major_version' from source: facts 22690 1727204265.97807: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204265.97824: variable 'omit' from source: magic vars 22690 1727204265.97864: variable 'omit' from source: magic vars 22690 1727204265.98271: variable 'omit' from source: magic vars 22690 1727204265.98275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204265.98279: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204265.98282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204265.98285: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204265.98288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204265.98643: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204265.98647: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.98651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.98653: Set connection var ansible_connection to ssh 22690 1727204265.98657: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204265.98659: Set connection var ansible_pipelining to False 22690 1727204265.98661: Set connection var ansible_shell_type to sh 22690 1727204265.98664: Set connection var ansible_shell_executable to /bin/sh 22690 1727204265.98710: Set connection var ansible_timeout to 10 22690 1727204265.98933: variable 'ansible_shell_executable' from source: unknown 22690 1727204265.98943: variable 'ansible_connection' from source: unknown 22690 1727204265.98951: variable 'ansible_module_compression' from source: unknown 22690 1727204265.98958: variable 'ansible_shell_type' from source: unknown 22690 1727204265.98970: variable 'ansible_shell_executable' from source: unknown 22690 1727204265.98979: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204265.98989: variable 'ansible_pipelining' from source: unknown 22690 1727204265.98997: variable 'ansible_timeout' from source: unknown 22690 1727204265.99009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204265.99357: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204265.99532: variable 'omit' from source: magic vars 22690 1727204265.99546: starting attempt loop 22690 1727204265.99556: running the 
handler 22690 1727204265.99584: variable 'ansible_facts' from source: unknown 22690 1727204265.99617: _low_level_execute_command(): starting 22690 1727204265.99635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204266.01269: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204266.01420: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.01470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204266.01709: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204266.03456: stdout chunk (state=3): >>>/root <<< 22690 1727204266.03562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204266.03649: stderr chunk (state=3): >>><<< 22690 1727204266.03666: stdout chunk (state=3): >>><<< 22690 1727204266.03877: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204266.03881: _low_level_execute_command(): starting 22690 1727204266.03884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254 `" && echo ansible-tmp-1727204266.0378842-25035-3476913320254="` echo /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254 `" ) && sleep 0' 22690 
1727204266.05158: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204266.05264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204266.05497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.05511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204266.05536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204266.05562: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204266.05789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204266.07738: stdout chunk (state=3): >>>ansible-tmp-1727204266.0378842-25035-3476913320254=/root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254 <<< 22690 1727204266.07992: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204266.08001: stdout chunk (state=3): >>><<< 22690 1727204266.08004: stderr chunk (state=3): >>><<< 22690 1727204266.08282: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204266.0378842-25035-3476913320254=/root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204266.08290: variable 'ansible_module_compression' from source: unknown 22690 1727204266.08293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204266.08536: variable 'ansible_facts' from source: 
unknown 22690 1727204266.08983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py 22690 1727204266.09412: Sending initial data 22690 1727204266.09419: Sent initial data (152 bytes) 22690 1727204266.11130: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204266.11389: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204266.11491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204266.13079: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204266.13173: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204266.13407: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp7944qc4x /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py <<< 22690 1727204266.13418: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py" <<< 22690 1727204266.13470: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp7944qc4x" to remote "/root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py" <<< 22690 1727204266.16597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204266.16623: stdout chunk (state=3): >>><<< 22690 1727204266.16637: stderr chunk (state=3): >>><<< 22690 1727204266.16677: done transferring module to remote 22690 1727204266.16797: _low_level_execute_command(): starting 22690 1727204266.16818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/ /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py && sleep 0' 22690 1727204266.18297: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204266.18303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.18309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204266.18314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.18686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204266.20473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204266.20570: stderr chunk (state=3): >>><<< 22690 1727204266.20575: stdout chunk (state=3): >>><<< 22690 1727204266.20595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204266.20610: _low_level_execute_command(): starting 22690 1727204266.20624: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/AnsiballZ_setup.py && sleep 0' 22690 1727204266.22191: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.22233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204266.22251: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204266.22403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204266.22513: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204266.89076: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3057, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 659, "free": 3057}, "nocache": {"free": 3489, "used": 227}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", 
"sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 613, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316568064, "block_size": 4096, "block_total": 64479564, "block_available": 61356584, "block_used": 3122980, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5322265625, "5m": 0.51318359375, "15m": 0.283203125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "46", "epoch": "1727204266", "epoch_int": "1727204266", "date": "2024-09-24", "time": "14:57:46", "iso8601_micro": "2024-09-24T18:57:46.845289Z", "iso8601": "2024-09-24T18:57:46Z", "iso8601_basic": "20240924T145746845289", "iso8601_basic_short": "20240924T145746", "tz": "EDT", "tz_dst": "EDT", 
"tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lsr27", "peerlsr27", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204266.91105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204266.91109: stdout chunk (state=3): >>><<< 22690 1727204266.91113: stderr chunk (state=3): >>><<< 22690 1727204266.91174: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3057, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": 
{"total": 3716, "used": 659, "free": 3057}, "nocache": {"free": 3489, "used": 227}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 613, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316568064, "block_size": 4096, "block_total": 64479564, "block_available": 61356584, "block_used": 3122980, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.5322265625, "5m": 0.51318359375, "15m": 0.283203125}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": 
"root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "46", "epoch": "1727204266", "epoch_int": "1727204266", "date": "2024-09-24", "time": "14:57:46", "iso8601_micro": "2024-09-24T18:57:46.845289Z", "iso8601": "2024-09-24T18:57:46Z", "iso8601_basic": "20240924T145746845289", "iso8601_basic_short": "20240924T145746", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lsr27", "peerlsr27", "lo"], "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "46:df:a1:d6:4d:5c", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::44df:a1ff:fed6:4d5c", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "4a:30:fc:5e:2c:a4", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::4830:fcff:fe5e:2ca4", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::44df:a1ff:fed6:4d5c", "fe80::f7:13ff:fe22:8fc1", "fe80::4830:fcff:fe5e:2ca4"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1", "fe80::44df:a1ff:fed6:4d5c", "fe80::4830:fcff:fe5e:2ca4"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204266.92244: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204266.92563: _low_level_execute_command(): starting 22690 1727204266.92570: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204266.0378842-25035-3476913320254/ > /dev/null 2>&1 && sleep 0' 22690 1727204266.93784: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204266.93788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.93791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204266.93794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204266.93885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204266.94182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
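The repeated "debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320'" lines show that every low-level command in this run reuses a single multiplexed SSH connection, the default behaviour of the ssh connection plugin (ControlMaster/ControlPersist). If those options ever needed to be set explicitly per host, an inventory sketch like the one below would do it; the host name and address come from this run, while the option values are simply the plugin's documented defaults, not something read from this run's configuration:

all:
  hosts:
    managed-node2:
      ansible_host: 10.31.47.73
      # Mirrors the ssh plugin defaults; the control sockets land under ~/.ansible/cp/ as seen in the log.
      ansible_ssh_common_args: "-o ControlMaster=auto -o ControlPersist=60s"
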
<<< 22690 1727204266.96374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204266.96378: stdout chunk (state=3): >>><<< 22690 1727204266.96381: stderr chunk (state=3): >>><<< 22690 1727204266.96384: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204266.96386: handler run complete 22690 1727204266.96481: variable 'ansible_facts' from source: unknown 22690 1727204266.96774: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204266.98025: variable 'ansible_facts' from source: unknown 22690 1727204266.98134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204266.98725: attempt loop complete, returning result 22690 1727204266.99173: _execute() done 22690 1727204266.99177: dumping result to json 22690 1727204266.99180: done dumping result, returning 22690 1727204266.99182: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-0000000003f8] 22690 1727204266.99184: sending task result for task 127b8e07-fff9-78bb-bf56-0000000003f8 ok: [managed-node2] 22690 1727204267.00406: no more pending results, returning what we have 22690 1727204267.00411: results queue empty 22690 1727204267.00412: checking for any_errors_fatal 22690 1727204267.00413: done checking for any_errors_fatal 22690 1727204267.00414: checking for max_fail_percentage 22690 1727204267.00416: done checking for max_fail_percentage 22690 1727204267.00417: checking to see if all hosts have failed and the running result is not ok 22690 1727204267.00418: done checking to see if all hosts have failed 22690 1727204267.00419: getting the remaining hosts for this loop 22690 1727204267.00420: done getting the remaining hosts for this loop 22690 1727204267.00426: getting the next task for host managed-node2 22690 1727204267.00432: done getting next task for host managed-node2 22690 1727204267.00434: ^ task is: TASK: meta (flush_handlers) 22690 1727204267.00437: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204267.00467: getting variables 22690 1727204267.00470: in VariableManager get_vars() 22690 1727204267.00576: Calling all_inventory to load vars for managed-node2 22690 1727204267.00582: Calling groups_inventory to load vars for managed-node2 22690 1727204267.00590: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.00602: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000003f8 22690 1727204267.00606: WORKER PROCESS EXITING 22690 1727204267.00624: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.00627: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.00631: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.04284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204267.09317: done with get_vars() 22690 1727204267.09357: done getting variables 22690 1727204267.09487: in VariableManager get_vars() 22690 1727204267.09499: Calling all_inventory to load vars for managed-node2 22690 1727204267.09502: Calling groups_inventory to load vars for managed-node2 22690 1727204267.09504: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.09510: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.09512: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.09516: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.13255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204267.16917: done with get_vars() 22690 1727204267.16971: done queuing things up, now waiting for results queue to drain 22690 1727204267.16974: results queue empty 22690 1727204267.16975: checking for any_errors_fatal 22690 1727204267.16980: done checking for any_errors_fatal 22690 1727204267.16981: checking for max_fail_percentage 22690 1727204267.16982: done checking for max_fail_percentage 22690 1727204267.16983: checking to see if all hosts have failed and the running result is not ok 22690 1727204267.16989: done checking to see if all hosts have failed 22690 1727204267.16990: getting the remaining hosts for this loop 22690 1727204267.16991: done getting the remaining hosts for this loop 22690 1727204267.16994: getting the next task for host managed-node2 22690 1727204267.16999: done getting next task for host managed-node2 22690 1727204267.17002: ^ task is: TASK: Include the task 'delete_interface.yml' 22690 1727204267.17004: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204267.17006: getting variables 22690 1727204267.17007: in VariableManager get_vars() 22690 1727204267.17018: Calling all_inventory to load vars for managed-node2 22690 1727204267.17021: Calling groups_inventory to load vars for managed-node2 22690 1727204267.17023: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.17031: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.17033: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.17036: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.18663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204267.20958: done with get_vars() 22690 1727204267.20994: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:57:47 -0400 (0:00:01.266) 0:00:34.494 ***** 22690 1727204267.21083: entering _queue_task() for managed-node2/include_tasks 22690 1727204267.21722: worker is 1 (out of 1 available) 22690 1727204267.21736: exiting _queue_task() for managed-node2/include_tasks 22690 1727204267.21749: done queuing things up, now waiting for results queue to drain 22690 1727204267.21750: waiting for pending results... 22690 1727204267.21956: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 22690 1727204267.22097: in run() - task 127b8e07-fff9-78bb-bf56-000000000054 22690 1727204267.22119: variable 'ansible_search_path' from source: unknown 22690 1727204267.22174: calling self._execute() 22690 1727204267.22284: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204267.22299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204267.22312: variable 'omit' from source: magic vars 22690 1727204267.22773: variable 'ansible_distribution_major_version' from source: facts 22690 1727204267.22792: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204267.22805: _execute() done 22690 1727204267.22822: dumping result to json 22690 1727204267.22841: done dumping result, returning 22690 1727204267.22870: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [127b8e07-fff9-78bb-bf56-000000000054] 22690 1727204267.22874: sending task result for task 127b8e07-fff9-78bb-bf56-000000000054 22690 1727204267.23145: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000054 22690 1727204267.23150: WORKER PROCESS EXITING 22690 1727204267.23184: no more pending results, returning what we have 22690 1727204267.23190: in VariableManager get_vars() 22690 1727204267.23231: Calling all_inventory to load vars for managed-node2 22690 1727204267.23235: Calling groups_inventory to load vars for managed-node2 22690 1727204267.23239: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.23373: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.23377: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.23381: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.25311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204267.27622: done with get_vars() 22690 
1727204267.27655: variable 'ansible_search_path' from source: unknown 22690 1727204267.27677: we have included files to process 22690 1727204267.27678: generating all_blocks data 22690 1727204267.27680: done generating all_blocks data 22690 1727204267.27681: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 22690 1727204267.27682: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 22690 1727204267.27689: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 22690 1727204267.27956: done processing included file 22690 1727204267.27958: iterating over new_blocks loaded from include file 22690 1727204267.27959: in VariableManager get_vars() 22690 1727204267.27976: done with get_vars() 22690 1727204267.27979: filtering new block on tags 22690 1727204267.27995: done filtering new block on tags 22690 1727204267.27999: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 22690 1727204267.28004: extending task lists for all hosts with included blocks 22690 1727204267.28047: done extending task lists 22690 1727204267.28048: done processing included files 22690 1727204267.28049: results queue empty 22690 1727204267.28050: checking for any_errors_fatal 22690 1727204267.28052: done checking for any_errors_fatal 22690 1727204267.28052: checking for max_fail_percentage 22690 1727204267.28053: done checking for max_fail_percentage 22690 1727204267.28054: checking to see if all hosts have failed and the running result is not ok 22690 1727204267.28055: done checking to see if all hosts have failed 22690 1727204267.28056: getting the remaining hosts for this loop 22690 1727204267.28057: done getting the remaining hosts for this loop 22690 1727204267.28060: getting the next task for host managed-node2 22690 1727204267.28063: done getting next task for host managed-node2 22690 1727204267.28068: ^ task is: TASK: Remove test interface if necessary 22690 1727204267.28070: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204267.28072: getting variables 22690 1727204267.28073: in VariableManager get_vars() 22690 1727204267.28082: Calling all_inventory to load vars for managed-node2 22690 1727204267.28085: Calling groups_inventory to load vars for managed-node2 22690 1727204267.28087: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.28093: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.28095: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.28097: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.29816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204267.32814: done with get_vars() 22690 1727204267.32850: done getting variables 22690 1727204267.33022: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.119) 0:00:34.613 ***** 22690 1727204267.33056: entering _queue_task() for managed-node2/command 22690 1727204267.33920: worker is 1 (out of 1 available) 22690 1727204267.33936: exiting _queue_task() for managed-node2/command 22690 1727204267.34008: done queuing things up, now waiting for results queue to drain 22690 1727204267.34011: waiting for pending results... 22690 1727204267.34580: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 22690 1727204267.34673: in run() - task 127b8e07-fff9-78bb-bf56-000000000409 22690 1727204267.35162: variable 'ansible_search_path' from source: unknown 22690 1727204267.35170: variable 'ansible_search_path' from source: unknown 22690 1727204267.35174: calling self._execute() 22690 1727204267.35253: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204267.35263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204267.35271: variable 'omit' from source: magic vars 22690 1727204267.36903: variable 'ansible_distribution_major_version' from source: facts 22690 1727204267.36913: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204267.36953: variable 'omit' from source: magic vars 22690 1727204267.36971: variable 'omit' from source: magic vars 22690 1727204267.37497: variable 'interface' from source: set_fact 22690 1727204267.37574: variable 'omit' from source: magic vars 22690 1727204267.37578: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204267.37614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204267.37640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204267.37659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204267.38081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 
1727204267.38132: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204267.38135: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204267.38138: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204267.38241: Set connection var ansible_connection to ssh 22690 1727204267.38249: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204267.38257: Set connection var ansible_pipelining to False 22690 1727204267.38260: Set connection var ansible_shell_type to sh 22690 1727204267.38268: Set connection var ansible_shell_executable to /bin/sh 22690 1727204267.38686: Set connection var ansible_timeout to 10 22690 1727204267.38715: variable 'ansible_shell_executable' from source: unknown 22690 1727204267.38722: variable 'ansible_connection' from source: unknown 22690 1727204267.38725: variable 'ansible_module_compression' from source: unknown 22690 1727204267.38727: variable 'ansible_shell_type' from source: unknown 22690 1727204267.38732: variable 'ansible_shell_executable' from source: unknown 22690 1727204267.38734: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204267.38740: variable 'ansible_pipelining' from source: unknown 22690 1727204267.38743: variable 'ansible_timeout' from source: unknown 22690 1727204267.38748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204267.39319: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204267.39336: variable 'omit' from source: magic vars 22690 1727204267.39340: starting attempt loop 22690 1727204267.39343: running the handler 22690 1727204267.39371: _low_level_execute_command(): starting 22690 1727204267.39374: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204267.41274: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204267.41279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204267.41282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204267.41285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204267.41588: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204267.43339: stdout chunk (state=3): >>>/root 
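The "Set connection var ..." lines above record the connection settings resolved for this task: the ssh transport, ZIP_DEFLATED module compression, pipelining off, /bin/sh as the remote shell, and a 10 second timeout. If these needed to be pinned rather than left to defaults, host variables with the same meanings could be used; the snippet below is a hypothetical host_vars/managed-node2.yml that simply restates the values shown in the log:

# Hypothetical host_vars/managed-node2.yml, mirroring the resolved connection vars above
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
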
<<< 22690 1727204267.43445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204267.43921: stderr chunk (state=3): >>><<< 22690 1727204267.43925: stdout chunk (state=3): >>><<< 22690 1727204267.43932: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204267.43935: _low_level_execute_command(): starting 22690 1727204267.43938: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021 `" && echo ansible-tmp-1727204267.4381742-25064-238759370728021="` echo /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021 `" ) && sleep 0' 22690 1727204267.45012: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204267.45102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204267.45124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204267.45152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204267.45374: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204267.45393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204267.45445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204267.47563: stdout chunk (state=3): 
>>>ansible-tmp-1727204267.4381742-25064-238759370728021=/root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021 <<< 22690 1727204267.47689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204267.47738: stderr chunk (state=3): >>><<< 22690 1727204267.47989: stdout chunk (state=3): >>><<< 22690 1727204267.48025: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204267.4381742-25064-238759370728021=/root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204267.48042: variable 'ansible_module_compression' from source: unknown 22690 1727204267.48244: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204267.48247: variable 'ansible_facts' from source: unknown 22690 1727204267.48358: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py 22690 1727204267.48845: Sending initial data 22690 1727204267.48848: Sent initial data (156 bytes) 22690 1727204267.50034: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204267.50185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204267.50336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204267.50356: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
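The two commands above ('echo ~', then the guarded mkdir pair) resolve the remote user's home directory and create the per-task working directory under ~/.ansible/tmp; the SFTP exchange that follows pushes the packaged AnsiballZ_command.py into that directory because pipelining is disabled for this connection. The location is controlled by the standard remote_tmp setting; the lines below are an illustrative inventory/vars entry, not taken from this run's configuration:

# Illustrative only; '~/.ansible/tmp' matches the _ansible_remote_tmp value visible later in the log.
ansible_remote_tmp: '~/.ansible/tmp'
# For debugging, ANSIBLE_KEEP_REMOTE_FILES=1 in the controller environment would keep the
# AnsiballZ payload instead of letting the 'rm -f -r .../ansible-tmp-...' cleanup remove it.
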
22690 1727204267.50453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204267.52300: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204267.52468: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204267.52712: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmplm7ymfk2 /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py <<< 22690 1727204267.52720: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py" <<< 22690 1727204267.52774: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmplm7ymfk2" to remote "/root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py" <<< 22690 1727204267.55096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204267.55101: stdout chunk (state=3): >>><<< 22690 1727204267.55139: stderr chunk (state=3): >>><<< 22690 1727204267.55142: done transferring module to remote 22690 1727204267.55175: _low_level_execute_command(): starting 22690 1727204267.55178: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/ /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py && sleep 0' 22690 1727204267.56378: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204267.56383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204267.56563: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204267.56570: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204267.56572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204267.56705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204267.56812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204267.58674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204267.58830: stderr chunk (state=3): >>><<< 22690 1727204267.58834: stdout chunk (state=3): >>><<< 22690 1727204267.58887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204267.58897: _low_level_execute_command(): starting 22690 1727204267.58900: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/AnsiballZ_command.py && sleep 0' 22690 1727204267.60788: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204267.60987: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204267.61304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204267.61321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204267.61420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204267.79479: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:57:47.777921", "end": "2024-09-24 14:57:47.789616", "delta": "0:00:00.011695", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204267.81606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204267.81797: stderr chunk (state=3): >>><<< 22690 1727204267.81809: stdout chunk (state=3): >>><<< 22690 1727204267.81838: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:57:47.777921", "end": "2024-09-24 14:57:47.789616", "delta": "0:00:00.011695", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204267.81883: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204267.81891: _low_level_execute_command(): starting 22690 1727204267.81897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204267.4381742-25064-238759370728021/ > /dev/null 2>&1 && sleep 0' 22690 1727204267.83721: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204267.83741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204267.83757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204267.83984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204267.84191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204267.84334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204267.86376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204267.86380: stdout chunk (state=3): >>><<< 22690 1727204267.86399: stderr chunk (state=3): >>><<< 22690 1727204267.86430: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204267.86448: handler run complete 22690 1727204267.86671: Evaluated conditional (False): False 22690 1727204267.86674: attempt loop complete, returning result 22690 1727204267.86677: _execute() done 22690 1727204267.86680: dumping result to json 22690 1727204267.86682: done dumping result, returning 22690 1727204267.86685: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [127b8e07-fff9-78bb-bf56-000000000409] 22690 1727204267.86687: sending task result for task 127b8e07-fff9-78bb-bf56-000000000409 ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.011695", "end": "2024-09-24 14:57:47.789616", "rc": 0, "start": "2024-09-24 14:57:47.777921" } 22690 1727204267.87104: no more pending results, returning what we have 22690 1727204267.87108: results queue empty 22690 1727204267.87109: checking for any_errors_fatal 22690 1727204267.87111: done checking for any_errors_fatal 22690 1727204267.87111: checking for max_fail_percentage 22690 1727204267.87113: done checking for max_fail_percentage 22690 1727204267.87114: checking to see if all hosts have failed and the running result is not ok 22690 1727204267.87115: done checking to see if all hosts have failed 22690 1727204267.87116: getting the remaining hosts for this loop 22690 1727204267.87117: done getting the remaining hosts for this loop 22690 1727204267.87286: getting the next task for host managed-node2 22690 1727204267.87295: done getting next task for host managed-node2 22690 1727204267.87297: ^ task is: TASK: meta (flush_handlers) 22690 1727204267.87299: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204267.87303: getting variables 22690 1727204267.87306: in VariableManager get_vars() 22690 1727204267.87336: Calling all_inventory to load vars for managed-node2 22690 1727204267.87338: Calling groups_inventory to load vars for managed-node2 22690 1727204267.87342: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.87350: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000409 22690 1727204267.87357: WORKER PROCESS EXITING 22690 1727204267.87405: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.87410: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.87413: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.89986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204267.94046: done with get_vars() 22690 1727204267.94091: done getting variables 22690 1727204267.94209: in VariableManager get_vars() 22690 1727204267.94222: Calling all_inventory to load vars for managed-node2 22690 1727204267.94224: Calling groups_inventory to load vars for managed-node2 22690 1727204267.94227: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204267.94238: Calling all_plugins_play to load vars for managed-node2 22690 1727204267.94247: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204267.94252: Calling groups_plugins_play to load vars for managed-node2 22690 1727204267.96791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204268.00591: done with get_vars() 22690 1727204268.00716: done queuing things up, now waiting for results queue to drain 22690 1727204268.00718: results queue empty 22690 1727204268.00719: checking for any_errors_fatal 22690 1727204268.00723: done checking for any_errors_fatal 22690 1727204268.00724: checking for max_fail_percentage 22690 1727204268.00726: done checking for max_fail_percentage 22690 1727204268.00726: checking to see if all hosts have failed and the running result is not ok 22690 1727204268.00727: done checking to see if all hosts have failed 22690 1727204268.00728: getting the remaining hosts for this loop 22690 1727204268.00760: done getting the remaining hosts for this loop 22690 1727204268.00767: getting the next task for host managed-node2 22690 1727204268.00773: done getting next task for host managed-node2 22690 1727204268.00775: ^ task is: TASK: meta (flush_handlers) 22690 1727204268.00777: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204268.00806: getting variables 22690 1727204268.00809: in VariableManager get_vars() 22690 1727204268.00824: Calling all_inventory to load vars for managed-node2 22690 1727204268.00827: Calling groups_inventory to load vars for managed-node2 22690 1727204268.00829: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204268.00836: Calling all_plugins_play to load vars for managed-node2 22690 1727204268.00839: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204268.00844: Calling groups_plugins_play to load vars for managed-node2 22690 1727204268.04438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204268.09344: done with get_vars() 22690 1727204268.09381: done getting variables 22690 1727204268.09431: in VariableManager get_vars() 22690 1727204268.09446: Calling all_inventory to load vars for managed-node2 22690 1727204268.09451: Calling groups_inventory to load vars for managed-node2 22690 1727204268.09455: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204268.09462: Calling all_plugins_play to load vars for managed-node2 22690 1727204268.09464: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204268.09472: Calling groups_plugins_play to load vars for managed-node2 22690 1727204268.15133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204268.16319: done with get_vars() 22690 1727204268.16351: done queuing things up, now waiting for results queue to drain 22690 1727204268.16353: results queue empty 22690 1727204268.16353: checking for any_errors_fatal 22690 1727204268.16355: done checking for any_errors_fatal 22690 1727204268.16355: checking for max_fail_percentage 22690 1727204268.16356: done checking for max_fail_percentage 22690 1727204268.16356: checking to see if all hosts have failed and the running result is not ok 22690 1727204268.16357: done checking to see if all hosts have failed 22690 1727204268.16357: getting the remaining hosts for this loop 22690 1727204268.16359: done getting the remaining hosts for this loop 22690 1727204268.16362: getting the next task for host managed-node2 22690 1727204268.16367: done getting next task for host managed-node2 22690 1727204268.16367: ^ task is: None 22690 1727204268.16369: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204268.16370: done queuing things up, now waiting for results queue to drain 22690 1727204268.16371: results queue empty 22690 1727204268.16371: checking for any_errors_fatal 22690 1727204268.16371: done checking for any_errors_fatal 22690 1727204268.16372: checking for max_fail_percentage 22690 1727204268.16373: done checking for max_fail_percentage 22690 1727204268.16373: checking to see if all hosts have failed and the running result is not ok 22690 1727204268.16374: done checking to see if all hosts have failed 22690 1727204268.16374: getting the next task for host managed-node2 22690 1727204268.16381: done getting next task for host managed-node2 22690 1727204268.16382: ^ task is: None 22690 1727204268.16382: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204268.16416: in VariableManager get_vars() 22690 1727204268.16433: done with get_vars() 22690 1727204268.16437: in VariableManager get_vars() 22690 1727204268.16445: done with get_vars() 22690 1727204268.16454: variable 'omit' from source: magic vars 22690 1727204268.16572: variable 'profile' from source: play vars 22690 1727204268.16672: in VariableManager get_vars() 22690 1727204268.16687: done with get_vars() 22690 1727204268.16709: variable 'omit' from source: magic vars 22690 1727204268.16778: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 22690 1727204268.17673: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204268.17739: getting the remaining hosts for this loop 22690 1727204268.17741: done getting the remaining hosts for this loop 22690 1727204268.17746: getting the next task for host managed-node2 22690 1727204268.17749: done getting next task for host managed-node2 22690 1727204268.17752: ^ task is: TASK: Gathering Facts 22690 1727204268.17753: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204268.17756: getting variables 22690 1727204268.17757: in VariableManager get_vars() 22690 1727204268.17774: Calling all_inventory to load vars for managed-node2 22690 1727204268.17776: Calling groups_inventory to load vars for managed-node2 22690 1727204268.17777: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204268.17782: Calling all_plugins_play to load vars for managed-node2 22690 1727204268.17784: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204268.17786: Calling groups_plugins_play to load vars for managed-node2 22690 1727204268.19080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204268.21096: done with get_vars() 22690 1727204268.21130: done getting variables 22690 1727204268.21181: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.881) 0:00:35.495 ***** 22690 1727204268.21209: entering _queue_task() for managed-node2/gather_facts 22690 1727204268.21585: worker is 1 (out of 1 available) 22690 1727204268.21598: exiting _queue_task() for managed-node2/gather_facts 22690 1727204268.21613: done queuing things up, now waiting for results queue to drain 22690 1727204268.21614: waiting for pending results... 
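The TASK [Gathering Facts] banner above carries two timings that read as the duration of the task that just finished (0:00:00.881) and the cumulative elapsed time for the run (0:00:35.495). The arithmetic behind both figures is plain timestamp subtraction, as in this small sketch; the start/end values are copied from the earlier module result, and the cumulative figure is back-computed from the banner, so treat the numbers as illustrative:

    from datetime import datetime, timedelta

    # Start/end of the previous command, copied from the module result above.
    start = datetime.fromisoformat("2024-09-24 14:57:47.777921")
    end = datetime.fromisoformat("2024-09-24 14:57:47.789616")
    print(end - start)  # 0:00:00.011695, the "delta" field in the result

    # Cumulative counter of the kind shown in the banner: previous total plus
    # the just-finished task's duration (34.614 s is inferred as
    # 35.495 - 0.881, not read directly from the log).
    previous_total = timedelta(seconds=34.614)
    task_duration = timedelta(seconds=0.881)
    print(previous_total + task_duration)  # 0:00:35.495000, matching the banner
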
22690 1727204268.22088: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204268.22093: in run() - task 127b8e07-fff9-78bb-bf56-000000000417 22690 1727204268.22097: variable 'ansible_search_path' from source: unknown 22690 1727204268.22116: calling self._execute() 22690 1727204268.22243: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204268.22258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204268.22276: variable 'omit' from source: magic vars 22690 1727204268.22614: variable 'ansible_distribution_major_version' from source: facts 22690 1727204268.22626: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204268.22633: variable 'omit' from source: magic vars 22690 1727204268.22659: variable 'omit' from source: magic vars 22690 1727204268.22769: variable 'omit' from source: magic vars 22690 1727204268.22775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204268.22822: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204268.22828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204268.22847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204268.22922: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204268.22926: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204268.22929: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204268.22932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204268.23171: Set connection var ansible_connection to ssh 22690 1727204268.23175: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204268.23177: Set connection var ansible_pipelining to False 22690 1727204268.23179: Set connection var ansible_shell_type to sh 22690 1727204268.23181: Set connection var ansible_shell_executable to /bin/sh 22690 1727204268.23184: Set connection var ansible_timeout to 10 22690 1727204268.23186: variable 'ansible_shell_executable' from source: unknown 22690 1727204268.23189: variable 'ansible_connection' from source: unknown 22690 1727204268.23191: variable 'ansible_module_compression' from source: unknown 22690 1727204268.23194: variable 'ansible_shell_type' from source: unknown 22690 1727204268.23197: variable 'ansible_shell_executable' from source: unknown 22690 1727204268.23200: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204268.23202: variable 'ansible_pipelining' from source: unknown 22690 1727204268.23205: variable 'ansible_timeout' from source: unknown 22690 1727204268.23207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204268.23377: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204268.23395: variable 'omit' from source: magic vars 22690 1727204268.23411: starting attempt loop 22690 1727204268.23422: running the 
handler 22690 1727204268.23446: variable 'ansible_facts' from source: unknown 22690 1727204268.23503: _low_level_execute_command(): starting 22690 1727204268.23524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204268.24377: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.24439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204268.24454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204268.24557: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204268.26290: stdout chunk (state=3): >>>/root <<< 22690 1727204268.26395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204268.26460: stderr chunk (state=3): >>><<< 22690 1727204268.26464: stdout chunk (state=3): >>><<< 22690 1727204268.26491: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204268.26505: _low_level_execute_command(): starting 22690 1727204268.26511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714 `" && echo ansible-tmp-1727204268.2649088-25100-73257461578714="` echo /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714 `" ) && sleep 0' 22690 
1727204268.27019: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204268.27023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.27026: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204268.27036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.27085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204268.27089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204268.27093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204268.27171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204268.29154: stdout chunk (state=3): >>>ansible-tmp-1727204268.2649088-25100-73257461578714=/root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714 <<< 22690 1727204268.29259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204268.29321: stderr chunk (state=3): >>><<< 22690 1727204268.29325: stdout chunk (state=3): >>><<< 22690 1727204268.29343: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204268.2649088-25100-73257461578714=/root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204268.29374: variable 'ansible_module_compression' from source: unknown 22690 1727204268.29418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 
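The chunks above show the usual remote bootstrap for a module run: _low_level_execute_command() creates a uniquely named working directory under ~/.ansible/tmp with a umask 77 && mkdir one-liner, the AnsiballZ payload for ansible.modules.setup is then pushed into it over the SFTP channel, and the ANSIBALLZ cache line confirms the zipped module did not have to be rebuilt. The sketch below only rebuilds a command string of the same shape as the one quoted in the log; the ansible-tmp-<epoch>-<number>-<random> naming is read off the directory names visible above, so this is an illustration of the pattern rather than Ansible's implementation:

    import random
    import time

    def make_remote_tmp_command(remote_tmp="~/.ansible/tmp"):
        # Name pattern inferred from the log: ansible-tmp-<epoch>-<number>-<random>.
        # 25100 is a placeholder copied from the directory name seen in the log.
        basefile = "ansible-tmp-%s-%s-%s" % (time.time(), 25100, random.randint(0, 2 ** 48))
        return (
            '( umask 77 && mkdir -p "` echo %s `" && '
            'mkdir "` echo %s/%s `" && '
            'echo %s="` echo %s/%s `" ) && sleep 0'
            % (remote_tmp, remote_tmp, basefile, basefile, remote_tmp, basefile)
        )

    print(make_remote_tmp_command())
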
22690 1727204268.29482: variable 'ansible_facts' from source: unknown 22690 1727204268.29621: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py 22690 1727204268.29747: Sending initial data 22690 1727204268.29750: Sent initial data (153 bytes) 22690 1727204268.30259: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204268.30263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.30268: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204268.30271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.30327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204268.30332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204268.30403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204268.31998: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204268.32066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204268.32133: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpm0l4kj5q /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py <<< 22690 1727204268.32137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py" <<< 22690 1727204268.32202: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpm0l4kj5q" to remote "/root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py" <<< 22690 1727204268.33610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204268.33671: stderr chunk (state=3): >>><<< 22690 1727204268.33676: stdout chunk (state=3): >>><<< 22690 1727204268.33700: done transferring module to remote 22690 1727204268.33712: _low_level_execute_command(): starting 22690 1727204268.33746: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/ /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py && sleep 0' 22690 1727204268.34307: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204268.34311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.34314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204268.34317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.34384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204268.34391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204268.34393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204268.34457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204268.36273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204268.36376: stderr chunk (state=3): >>><<< 22690 1727204268.36412: stdout chunk (state=3): >>><<< 22690 1727204268.36417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204268.36420: _low_level_execute_command(): starting 22690 1727204268.36422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/AnsiballZ_setup.py && sleep 0' 22690 1727204268.37108: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204268.37134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204268.37171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204268.37249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204269.01777: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", 
"SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, 
"ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3053, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 663, "free": 3053}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 615, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316551680, "block_size": 4096, "block_total": 64479564, "block_available": 61356580, "block_used": 3122984, "inode_total": 16384000, "inode_available": 
16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ans<<< 22690 1727204269.01786: stdout chunk (state=3): >>>ible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "49", "epoch": "1727204269", "epoch_int": "1727204269", "date": "2024-09-24", "time": "14:57:49", "iso8601_micro": "2024-09-24T18:57:49.014178Z", "iso8601": "2024-09-24T18:57:49Z", "iso8601_basic": "20240924T145749014178", "iso8601_basic_short": "20240924T145749", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.56982421875, "5m": 0.521484375, "15m": 0.287109375}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204269.03803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204269.03867: stderr chunk (state=3): >>><<< 22690 1727204269.03876: stdout chunk (state=3): >>><<< 22690 1727204269.03906: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", 
"ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3053, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 663, "free": 3053}, "nocache": {"free": 3485, "used": 231}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 615, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316551680, "block_size": 4096, "block_total": 64479564, "block_available": 61356580, "block_used": 3122984, "inode_total": 16384000, "inode_available": 16301509, "inode_used": 82491, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "49", "epoch": "1727204269", "epoch_int": "1727204269", "date": "2024-09-24", "time": "14:57:49", "iso8601_micro": "2024-09-24T18:57:49.014178Z", "iso8601": "2024-09-24T18:57:49Z", "iso8601_basic": "20240924T145749014178", "iso8601_basic_short": "20240924T145749", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.56982421875, "5m": 0.521484375, "15m": 0.287109375}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204269.04130: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204269.04151: _low_level_execute_command(): starting 22690 1727204269.04155: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204268.2649088-25100-73257461578714/ > /dev/null 2>&1 && sleep 0' 22690 1727204269.04799: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204269.04805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204269.04808: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204269.04811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204269.04815: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204269.04849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204269.04876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204269.04978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204269.06882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204269.06950: stderr chunk (state=3): >>><<< 22690 1727204269.06954: stdout chunk (state=3): >>><<< 22690 1727204269.06969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204269.06978: handler run complete 22690 1727204269.07070: variable 'ansible_facts' from source: unknown 22690 1727204269.07146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.07466: variable 'ansible_facts' from source: unknown 22690 1727204269.07523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.07606: attempt loop complete, returning result 22690 1727204269.07610: _execute() done 22690 1727204269.07613: dumping result to json 22690 1727204269.07630: done dumping result, returning 22690 1727204269.07637: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-000000000417] 22690 1727204269.07642: sending task result for task 127b8e07-fff9-78bb-bf56-000000000417 22690 1727204269.07901: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000417 22690 1727204269.07904: WORKER PROCESS EXITING ok: [managed-node2] 22690 1727204269.08178: no more pending results, returning what we have 22690 1727204269.08181: results queue empty 22690 1727204269.08182: checking for any_errors_fatal 22690 1727204269.08183: done checking for any_errors_fatal 22690 1727204269.08184: checking for max_fail_percentage 22690 1727204269.08186: done checking for max_fail_percentage 22690 1727204269.08186: checking to see if all hosts have failed and the running result is not ok 22690 1727204269.08187: done checking to see if all hosts have failed 22690 1727204269.08188: getting the remaining hosts for this loop 22690 1727204269.08189: done getting the remaining hosts for this loop 22690 1727204269.08193: getting the next task for host managed-node2 22690 1727204269.08198: done getting next task for host managed-node2 22690 1727204269.08200: ^ task is: TASK: meta (flush_handlers) 22690 1727204269.08203: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204269.08206: getting variables 22690 1727204269.08207: in VariableManager get_vars() 22690 1727204269.08263: Calling all_inventory to load vars for managed-node2 22690 1727204269.08275: Calling groups_inventory to load vars for managed-node2 22690 1727204269.08284: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.08296: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.08299: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.08302: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.10400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.12770: done with get_vars() 22690 1727204269.12799: done getting variables 22690 1727204269.12859: in VariableManager get_vars() 22690 1727204269.12873: Calling all_inventory to load vars for managed-node2 22690 1727204269.12875: Calling groups_inventory to load vars for managed-node2 22690 1727204269.12876: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.12880: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.12882: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.12884: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.13829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.15595: done with get_vars() 22690 1727204269.15646: done queuing things up, now waiting for results queue to drain 22690 1727204269.15648: results queue empty 22690 1727204269.15650: checking for any_errors_fatal 22690 1727204269.15654: done checking for any_errors_fatal 22690 1727204269.15655: checking for max_fail_percentage 22690 1727204269.15655: done checking for max_fail_percentage 22690 1727204269.15659: checking to see if all hosts have failed and the running result is not ok 22690 1727204269.15660: done checking to see if all hosts have failed 22690 1727204269.15660: getting the remaining hosts for this loop 22690 1727204269.15661: done getting the remaining hosts for this loop 22690 1727204269.15664: getting the next task for host managed-node2 22690 1727204269.15669: done getting next task for host managed-node2 22690 1727204269.15672: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22690 1727204269.15673: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204269.15682: getting variables 22690 1727204269.15683: in VariableManager get_vars() 22690 1727204269.15695: Calling all_inventory to load vars for managed-node2 22690 1727204269.15696: Calling groups_inventory to load vars for managed-node2 22690 1727204269.15698: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.15702: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.15703: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.15705: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.16576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.17820: done with get_vars() 22690 1727204269.17845: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.967) 0:00:36.462 ***** 22690 1727204269.17935: entering _queue_task() for managed-node2/include_tasks 22690 1727204269.18377: worker is 1 (out of 1 available) 22690 1727204269.18392: exiting _queue_task() for managed-node2/include_tasks 22690 1727204269.18407: done queuing things up, now waiting for results queue to drain 22690 1727204269.18408: waiting for pending results... 22690 1727204269.19016: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22690 1727204269.19025: in run() - task 127b8e07-fff9-78bb-bf56-00000000005c 22690 1727204269.19029: variable 'ansible_search_path' from source: unknown 22690 1727204269.19032: variable 'ansible_search_path' from source: unknown 22690 1727204269.19036: calling self._execute() 22690 1727204269.19201: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.19227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.19340: variable 'omit' from source: magic vars 22690 1727204269.19838: variable 'ansible_distribution_major_version' from source: facts 22690 1727204269.19857: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204269.19878: _execute() done 22690 1727204269.19886: dumping result to json 22690 1727204269.19895: done dumping result, returning 22690 1727204269.19973: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [127b8e07-fff9-78bb-bf56-00000000005c] 22690 1727204269.19976: sending task result for task 127b8e07-fff9-78bb-bf56-00000000005c 22690 1727204269.20062: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000005c 22690 1727204269.20067: WORKER PROCESS EXITING 22690 1727204269.20125: no more pending results, returning what we have 22690 1727204269.20131: in VariableManager get_vars() 22690 1727204269.20184: Calling all_inventory to load vars for managed-node2 22690 1727204269.20188: Calling groups_inventory to load vars for managed-node2 22690 1727204269.20190: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.20207: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.20211: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.20217: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.23260: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.27109: done with get_vars() 22690 1727204269.27163: variable 'ansible_search_path' from source: unknown 22690 1727204269.27164: variable 'ansible_search_path' from source: unknown 22690 1727204269.27202: we have included files to process 22690 1727204269.27203: generating all_blocks data 22690 1727204269.27205: done generating all_blocks data 22690 1727204269.27206: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204269.27207: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204269.27209: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22690 1727204269.27813: done processing included file 22690 1727204269.27818: iterating over new_blocks loaded from include file 22690 1727204269.27819: in VariableManager get_vars() 22690 1727204269.27845: done with get_vars() 22690 1727204269.27847: filtering new block on tags 22690 1727204269.27866: done filtering new block on tags 22690 1727204269.27870: in VariableManager get_vars() 22690 1727204269.27892: done with get_vars() 22690 1727204269.27894: filtering new block on tags 22690 1727204269.27920: done filtering new block on tags 22690 1727204269.27923: in VariableManager get_vars() 22690 1727204269.27943: done with get_vars() 22690 1727204269.27945: filtering new block on tags 22690 1727204269.27961: done filtering new block on tags 22690 1727204269.27963: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 22690 1727204269.27971: extending task lists for all hosts with included blocks 22690 1727204269.28404: done extending task lists 22690 1727204269.28406: done processing included files 22690 1727204269.28407: results queue empty 22690 1727204269.28408: checking for any_errors_fatal 22690 1727204269.28409: done checking for any_errors_fatal 22690 1727204269.28410: checking for max_fail_percentage 22690 1727204269.28411: done checking for max_fail_percentage 22690 1727204269.28412: checking to see if all hosts have failed and the running result is not ok 22690 1727204269.28413: done checking to see if all hosts have failed 22690 1727204269.28413: getting the remaining hosts for this loop 22690 1727204269.28417: done getting the remaining hosts for this loop 22690 1727204269.28420: getting the next task for host managed-node2 22690 1727204269.28424: done getting next task for host managed-node2 22690 1727204269.28426: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22690 1727204269.28429: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204269.28439: getting variables 22690 1727204269.28440: in VariableManager get_vars() 22690 1727204269.28456: Calling all_inventory to load vars for managed-node2 22690 1727204269.28458: Calling groups_inventory to load vars for managed-node2 22690 1727204269.28460: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.28472: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.28475: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.28479: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.31977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.36324: done with get_vars() 22690 1727204269.36360: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.185) 0:00:36.647 ***** 22690 1727204269.36453: entering _queue_task() for managed-node2/setup 22690 1727204269.37174: worker is 1 (out of 1 available) 22690 1727204269.37187: exiting _queue_task() for managed-node2/setup 22690 1727204269.37202: done queuing things up, now waiting for results queue to drain 22690 1727204269.37204: waiting for pending results... 22690 1727204269.37863: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22690 1727204269.38037: in run() - task 127b8e07-fff9-78bb-bf56-000000000458 22690 1727204269.38375: variable 'ansible_search_path' from source: unknown 22690 1727204269.38379: variable 'ansible_search_path' from source: unknown 22690 1727204269.38382: calling self._execute() 22690 1727204269.38559: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.38971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.38975: variable 'omit' from source: magic vars 22690 1727204269.39879: variable 'ansible_distribution_major_version' from source: facts 22690 1727204269.39900: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204269.40374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204269.45764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204269.45912: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204269.45954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204269.46054: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204269.46155: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204269.46337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204269.46437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22690 1727204269.46525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204269.46613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204269.46690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204269.46812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204269.46947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204269.46950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204269.47110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204269.47113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204269.47410: variable '__network_required_facts' from source: role '' defaults 22690 1727204269.47547: variable 'ansible_facts' from source: unknown 22690 1727204269.49545: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22690 1727204269.49612: when evaluation is False, skipping this task 22690 1727204269.49616: _execute() done 22690 1727204269.49619: dumping result to json 22690 1727204269.49622: done dumping result, returning 22690 1727204269.49624: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [127b8e07-fff9-78bb-bf56-000000000458] 22690 1727204269.49627: sending task result for task 127b8e07-fff9-78bb-bf56-000000000458 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204269.49813: no more pending results, returning what we have 22690 1727204269.49820: results queue empty 22690 1727204269.49821: checking for any_errors_fatal 22690 1727204269.49823: done checking for any_errors_fatal 22690 1727204269.49824: checking for max_fail_percentage 22690 1727204269.49826: done checking for max_fail_percentage 22690 1727204269.49827: checking to see if all hosts have failed and the running result is not ok 22690 1727204269.49827: done checking to see if all hosts have failed 22690 1727204269.49828: getting the remaining hosts for this loop 22690 1727204269.49830: done getting the remaining hosts for this loop 22690 1727204269.49836: getting the next task for host managed-node2 22690 1727204269.49846: done getting next task for host 
managed-node2 22690 1727204269.49850: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22690 1727204269.49853: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204269.49869: getting variables 22690 1727204269.49871: in VariableManager get_vars() 22690 1727204269.49914: Calling all_inventory to load vars for managed-node2 22690 1727204269.49919: Calling groups_inventory to load vars for managed-node2 22690 1727204269.49921: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.49934: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.49937: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.49941: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.50558: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000458 22690 1727204269.50562: WORKER PROCESS EXITING 22690 1727204269.54608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.58068: done with get_vars() 22690 1727204269.58209: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.220) 0:00:36.868 ***** 22690 1727204269.58482: entering _queue_task() for managed-node2/stat 22690 1727204269.59263: worker is 1 (out of 1 available) 22690 1727204269.59401: exiting _queue_task() for managed-node2/stat 22690 1727204269.59417: done queuing things up, now waiting for results queue to drain 22690 1727204269.59419: waiting for pending results... 
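
The task skipped just above, "Ensure ansible_facts used by role are present", only re-runs setup when the logged when-expression finds required facts missing. A minimal sketch of that gating pattern follows; the fact names and the setup parameters are illustrative assumptions, and only the when-expression is taken verbatim from the log.

- hosts: managed-node2
  gather_facts: false
  vars:
    # hypothetical stand-ins; the real list comes from the role defaults
    __network_required_facts:
      - distribution
      - distribution_major_version
  tasks:
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min        # assumed parameter, not read from the role source
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Because every fact the role requires is already present in ansible_facts at this point in the run, the difference is empty and the conditional evaluates to False, which is exactly the "when evaluation is False, skipping this task" result logged above.
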
22690 1727204269.59635: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 22690 1727204269.59801: in run() - task 127b8e07-fff9-78bb-bf56-00000000045a 22690 1727204269.59831: variable 'ansible_search_path' from source: unknown 22690 1727204269.59865: variable 'ansible_search_path' from source: unknown 22690 1727204269.59895: calling self._execute() 22690 1727204269.60011: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.60024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.60071: variable 'omit' from source: magic vars 22690 1727204269.60486: variable 'ansible_distribution_major_version' from source: facts 22690 1727204269.60504: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204269.60713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204269.61065: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204269.61098: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204269.61146: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204269.61198: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204269.61362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204269.61571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204269.61576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204269.61580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204269.61957: variable '__network_is_ostree' from source: set_fact 22690 1727204269.61961: Evaluated conditional (not __network_is_ostree is defined): False 22690 1727204269.61964: when evaluation is False, skipping this task 22690 1727204269.61970: _execute() done 22690 1727204269.61972: dumping result to json 22690 1727204269.61976: done dumping result, returning 22690 1727204269.61990: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [127b8e07-fff9-78bb-bf56-00000000045a] 22690 1727204269.62133: sending task result for task 127b8e07-fff9-78bb-bf56-00000000045a 22690 1727204269.62250: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000045a 22690 1727204269.62254: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22690 1727204269.62331: no more pending results, returning what we have 22690 1727204269.62336: results queue empty 22690 1727204269.62337: checking for any_errors_fatal 22690 1727204269.62343: done checking for any_errors_fatal 22690 1727204269.62344: checking for 
max_fail_percentage 22690 1727204269.62346: done checking for max_fail_percentage 22690 1727204269.62347: checking to see if all hosts have failed and the running result is not ok 22690 1727204269.62348: done checking to see if all hosts have failed 22690 1727204269.62348: getting the remaining hosts for this loop 22690 1727204269.62350: done getting the remaining hosts for this loop 22690 1727204269.62355: getting the next task for host managed-node2 22690 1727204269.62361: done getting next task for host managed-node2 22690 1727204269.62468: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22690 1727204269.62472: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204269.62492: getting variables 22690 1727204269.62494: in VariableManager get_vars() 22690 1727204269.62541: Calling all_inventory to load vars for managed-node2 22690 1727204269.62545: Calling groups_inventory to load vars for managed-node2 22690 1727204269.62547: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.62561: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.62564: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.62896: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.65841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.68262: done with get_vars() 22690 1727204269.68302: done getting variables 22690 1727204269.68382: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.099) 0:00:36.967 ***** 22690 1727204269.68427: entering _queue_task() for managed-node2/set_fact 22690 1727204269.68976: worker is 1 (out of 1 available) 22690 1727204269.68991: exiting _queue_task() for managed-node2/set_fact 22690 1727204269.69004: done queuing things up, now waiting for results queue to drain 22690 1727204269.69006: waiting for pending results... 
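
Both "Check if system is ostree" (a stat task) and the following "Set flag to indicate system is ostree" (a set_fact task) are skipped in this pass because __network_is_ostree was already set earlier in the run; the guard "not __network_is_ostree is defined" appears verbatim as the false_condition. A hedged sketch of that check-once pattern, in which the stat path and the register name are assumptions rather than values taken from the role source:

- hosts: managed-node2
  tasks:
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed marker path
      register: __ostree_booted_stat    # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

Guarding both tasks with the same condition keeps the flag idempotent, which is why this second pass through set_facts.yml skips them instead of re-running the stat.
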
22690 1727204269.69399: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22690 1727204269.69404: in run() - task 127b8e07-fff9-78bb-bf56-00000000045b 22690 1727204269.69408: variable 'ansible_search_path' from source: unknown 22690 1727204269.69471: variable 'ansible_search_path' from source: unknown 22690 1727204269.69476: calling self._execute() 22690 1727204269.69593: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.69612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.69632: variable 'omit' from source: magic vars 22690 1727204269.70088: variable 'ansible_distribution_major_version' from source: facts 22690 1727204269.70109: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204269.70319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204269.70691: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204269.70700: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204269.70750: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204269.70797: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204269.70908: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204269.70950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204269.70988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204269.71045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204269.71154: variable '__network_is_ostree' from source: set_fact 22690 1727204269.71172: Evaluated conditional (not __network_is_ostree is defined): False 22690 1727204269.71235: when evaluation is False, skipping this task 22690 1727204269.71238: _execute() done 22690 1727204269.71240: dumping result to json 22690 1727204269.71243: done dumping result, returning 22690 1727204269.71247: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [127b8e07-fff9-78bb-bf56-00000000045b] 22690 1727204269.71249: sending task result for task 127b8e07-fff9-78bb-bf56-00000000045b 22690 1727204269.71577: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000045b 22690 1727204269.71581: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22690 1727204269.71854: no more pending results, returning what we have 22690 1727204269.71858: results queue empty 22690 1727204269.71859: checking for any_errors_fatal 22690 1727204269.71869: done checking for any_errors_fatal 22690 
1727204269.71870: checking for max_fail_percentage 22690 1727204269.71872: done checking for max_fail_percentage 22690 1727204269.71873: checking to see if all hosts have failed and the running result is not ok 22690 1727204269.71874: done checking to see if all hosts have failed 22690 1727204269.71875: getting the remaining hosts for this loop 22690 1727204269.71876: done getting the remaining hosts for this loop 22690 1727204269.71881: getting the next task for host managed-node2 22690 1727204269.71890: done getting next task for host managed-node2 22690 1727204269.71895: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22690 1727204269.71898: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204269.71946: getting variables 22690 1727204269.71948: in VariableManager get_vars() 22690 1727204269.71998: Calling all_inventory to load vars for managed-node2 22690 1727204269.72001: Calling groups_inventory to load vars for managed-node2 22690 1727204269.72004: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204269.72375: Calling all_plugins_play to load vars for managed-node2 22690 1727204269.72388: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204269.72394: Calling groups_plugins_play to load vars for managed-node2 22690 1727204269.75194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204269.78649: done with get_vars() 22690 1727204269.78689: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.103) 0:00:37.071 ***** 22690 1727204269.78795: entering _queue_task() for managed-node2/service_facts 22690 1727204269.79288: worker is 1 (out of 1 available) 22690 1727204269.79307: exiting _queue_task() for managed-node2/service_facts 22690 1727204269.79319: done queuing things up, now waiting for results queue to drain 22690 1727204269.79321: waiting for pending results... 
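
The task queued here, "Check which services are running", runs the service_facts module (the log below shows the cached ansible.modules.service_facts AnsiballZ payload being transferred over the multiplexed SSH connection) and returns the ansible_facts.services dictionary that fills the large JSON stdout chunk further down. A minimal sketch of the task and of how such a result is typically consumed; the debug task is illustrative only.

- hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Use the gathered service state (illustrative only)
      ansible.builtin.debug:
        msg: "NetworkManager.service is {{ ansible_facts.services['NetworkManager.service'].state }}"
      when: "'NetworkManager.service' in ansible_facts.services"

The keys of ansible_facts.services are unit names such as NetworkManager.service, each mapping to name, state, status and source, matching the structure visible in the module output below.
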
22690 1727204269.79562: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 22690 1727204269.79737: in run() - task 127b8e07-fff9-78bb-bf56-00000000045d 22690 1727204269.79741: variable 'ansible_search_path' from source: unknown 22690 1727204269.79745: variable 'ansible_search_path' from source: unknown 22690 1727204269.79847: calling self._execute() 22690 1727204269.79902: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.79915: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.79929: variable 'omit' from source: magic vars 22690 1727204269.80350: variable 'ansible_distribution_major_version' from source: facts 22690 1727204269.80434: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204269.80470: variable 'omit' from source: magic vars 22690 1727204269.80589: variable 'omit' from source: magic vars 22690 1727204269.80716: variable 'omit' from source: magic vars 22690 1727204269.80778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204269.80859: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204269.80877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204269.80901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204269.80949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204269.80997: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204269.81060: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.81064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.81164: Set connection var ansible_connection to ssh 22690 1727204269.81290: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204269.81293: Set connection var ansible_pipelining to False 22690 1727204269.81296: Set connection var ansible_shell_type to sh 22690 1727204269.81298: Set connection var ansible_shell_executable to /bin/sh 22690 1727204269.81300: Set connection var ansible_timeout to 10 22690 1727204269.81333: variable 'ansible_shell_executable' from source: unknown 22690 1727204269.81341: variable 'ansible_connection' from source: unknown 22690 1727204269.81383: variable 'ansible_module_compression' from source: unknown 22690 1727204269.81386: variable 'ansible_shell_type' from source: unknown 22690 1727204269.81393: variable 'ansible_shell_executable' from source: unknown 22690 1727204269.81396: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204269.81398: variable 'ansible_pipelining' from source: unknown 22690 1727204269.81400: variable 'ansible_timeout' from source: unknown 22690 1727204269.81402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204269.81628: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204269.81710: variable 'omit' from source: magic vars 22690 
1727204269.81715: starting attempt loop 22690 1727204269.81718: running the handler 22690 1727204269.81721: _low_level_execute_command(): starting 22690 1727204269.81724: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204269.82719: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204269.82800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204269.82824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204269.82900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204269.83068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204269.84921: stdout chunk (state=3): >>>/root <<< 22690 1727204269.85134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204269.85137: stdout chunk (state=3): >>><<< 22690 1727204269.85140: stderr chunk (state=3): >>><<< 22690 1727204269.85142: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204269.85190: _low_level_execute_command(): starting 22690 1727204269.85204: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117 `" && echo ansible-tmp-1727204269.8511965-25146-221304119143117="` echo 
/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117 `" ) && sleep 0' 22690 1727204269.86645: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204269.86649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204269.86652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204269.86691: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204269.86709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204269.86861: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204269.86923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204269.88971: stdout chunk (state=3): >>>ansible-tmp-1727204269.8511965-25146-221304119143117=/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117 <<< 22690 1727204269.89290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204269.89463: stderr chunk (state=3): >>><<< 22690 1727204269.89567: stdout chunk (state=3): >>><<< 22690 1727204269.89572: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204269.8511965-25146-221304119143117=/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204269.89584: variable 'ansible_module_compression' from source: unknown 22690 1727204269.89674: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 22690 1727204269.89885: variable 'ansible_facts' from source: unknown 22690 1727204269.90012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py 22690 1727204269.90443: Sending initial data 22690 1727204269.90454: Sent initial data (162 bytes) 22690 1727204269.91093: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204269.91113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204269.91135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204269.91218: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204269.91225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204269.91304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204269.93176: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204269.93220: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204269.93289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpblutc923 /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py <<< 22690 1727204269.93293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py" <<< 22690 1727204269.93359: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpblutc923" to remote "/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py" <<< 22690 1727204269.95781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204269.95786: stderr chunk (state=3): >>><<< 22690 1727204269.95789: stdout chunk (state=3): >>><<< 22690 1727204269.95833: done transferring module to remote 22690 1727204269.96130: _low_level_execute_command(): starting 22690 1727204269.96135: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/ /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py && sleep 0' 22690 1727204269.97995: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204269.98206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204269.98242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204269.98335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204270.00154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204270.00243: stderr chunk (state=3): >>><<< 22690 1727204270.00250: stdout chunk (state=3): >>><<< 22690 1727204270.00281: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204270.00284: _low_level_execute_command(): starting 22690 1727204270.00288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/AnsiballZ_service_facts.py && sleep 0' 22690 1727204270.01239: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204270.01243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204270.01335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204270.01523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204272.18264: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind<<< 22690 1727204272.18284: stdout chunk (state=3): >>>.service", "state": "stopped", 
"status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": 
"systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22690 1727204272.20474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204272.20479: stdout chunk (state=3): >>><<< 22690 1727204272.20481: stderr chunk (state=3): >>><<< 22690 1727204272.20487: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": 
"plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-bsod.service": {"name": "systemd-bsod.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": 
"systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "passim.service": {"name": "passim.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-storagetm.service": {"name": "systemd-storagetm.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204272.22251: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204272.22576: _low_level_execute_command(): starting 22690 1727204272.22581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204269.8511965-25146-221304119143117/ > /dev/null 2>&1 && sleep 0' 22690 1727204272.24099: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204272.24316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204272.24372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204272.26442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204272.26453: stdout chunk (state=3): >>><<< 22690 1727204272.26464: stderr chunk (state=3): >>><<< 22690 1727204272.26485: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204272.26497: handler run complete 22690 1727204272.27004: variable 'ansible_facts' from source: unknown 22690 1727204272.27388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204272.28383: variable 'ansible_facts' from source: unknown 22690 1727204272.28568: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204272.28844: attempt loop complete, returning result 22690 1727204272.28857: _execute() done 22690 1727204272.28867: dumping result to json 22690 1727204272.28947: done dumping result, returning 22690 1727204272.28964: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [127b8e07-fff9-78bb-bf56-00000000045d] 22690 1727204272.28976: sending task result for task 127b8e07-fff9-78bb-bf56-00000000045d ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204272.31553: no more pending results, returning what we have 22690 1727204272.31557: results queue empty 22690 1727204272.31558: checking for any_errors_fatal 22690 1727204272.31562: done checking for any_errors_fatal 22690 1727204272.31563: checking for max_fail_percentage 22690 1727204272.31567: done checking for max_fail_percentage 22690 1727204272.31568: checking to see if all hosts have failed and the running result is not ok 22690 1727204272.31569: done checking to see if all hosts have failed 22690 1727204272.31570: getting the remaining hosts for this loop 22690 1727204272.31572: done getting the remaining hosts for this loop 22690 1727204272.31575: getting the next task for host managed-node2 22690 1727204272.31581: done getting next task for host managed-node2 22690 1727204272.31585: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22690 1727204272.31588: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204272.31598: getting variables 22690 1727204272.31600: in VariableManager get_vars() 22690 1727204272.31637: Calling all_inventory to load vars for managed-node2 22690 1727204272.31640: Calling groups_inventory to load vars for managed-node2 22690 1727204272.31643: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204272.31653: Calling all_plugins_play to load vars for managed-node2 22690 1727204272.31656: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204272.31660: Calling groups_plugins_play to load vars for managed-node2 22690 1727204272.32370: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000045d 22690 1727204272.32377: WORKER PROCESS EXITING 22690 1727204272.36239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204272.41448: done with get_vars() 22690 1727204272.41495: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:52 -0400 (0:00:02.631) 0:00:39.702 ***** 22690 1727204272.41903: entering _queue_task() for managed-node2/package_facts 22690 1727204272.42964: worker is 1 (out of 1 available) 22690 1727204272.43070: exiting _queue_task() for managed-node2/package_facts 22690 1727204272.43084: done queuing things up, now waiting for results queue to drain 22690 1727204272.43085: waiting for pending results... 22690 1727204272.43486: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 22690 1727204272.43678: in run() - task 127b8e07-fff9-78bb-bf56-00000000045e 22690 1727204272.43704: variable 'ansible_search_path' from source: unknown 22690 1727204272.43726: variable 'ansible_search_path' from source: unknown 22690 1727204272.43823: calling self._execute() 22690 1727204272.43903: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204272.43918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204272.43943: variable 'omit' from source: magic vars 22690 1727204272.44378: variable 'ansible_distribution_major_version' from source: facts 22690 1727204272.44397: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204272.44410: variable 'omit' from source: magic vars 22690 1727204272.44572: variable 'omit' from source: magic vars 22690 1727204272.44576: variable 'omit' from source: magic vars 22690 1727204272.44601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204272.44648: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204272.44678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204272.44712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204272.44771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204272.44775: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204272.44780: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204272.44789: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204272.44910: Set connection var ansible_connection to ssh 22690 1727204272.44936: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204272.44950: Set connection var ansible_pipelining to False 22690 1727204272.44970: Set connection var ansible_shell_type to sh 22690 1727204272.44973: Set connection var ansible_shell_executable to /bin/sh 22690 1727204272.44983: Set connection var ansible_timeout to 10 22690 1727204272.45031: variable 'ansible_shell_executable' from source: unknown 22690 1727204272.45034: variable 'ansible_connection' from source: unknown 22690 1727204272.45037: variable 'ansible_module_compression' from source: unknown 22690 1727204272.45129: variable 'ansible_shell_type' from source: unknown 22690 1727204272.45134: variable 'ansible_shell_executable' from source: unknown 22690 1727204272.45137: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204272.45139: variable 'ansible_pipelining' from source: unknown 22690 1727204272.45141: variable 'ansible_timeout' from source: unknown 22690 1727204272.45144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204272.45347: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204272.45351: variable 'omit' from source: magic vars 22690 1727204272.45354: starting attempt loop 22690 1727204272.45356: running the handler 22690 1727204272.45373: _low_level_execute_command(): starting 22690 1727204272.45455: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204272.46239: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204272.46348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204272.46600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204272.46705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204272.48501: stdout chunk (state=3): >>>/root <<< 22690 1727204272.48901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204272.48917: stdout chunk (state=3): >>><<< 22690 1727204272.49125: stderr chunk (state=3): >>><<< 22690 1727204272.49129: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204272.49132: _low_level_execute_command(): starting 22690 1727204272.49135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118 `" && echo ansible-tmp-1727204272.4901133-25245-60447783966118="` echo /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118 `" ) && sleep 0' 22690 1727204272.50328: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204272.50388: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204272.50470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204272.50607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204272.50627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204272.50705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204272.52829: stdout chunk (state=3): >>>ansible-tmp-1727204272.4901133-25245-60447783966118=/root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118 <<< 22690 1727204272.52833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204272.53274: stderr chunk (state=3): >>><<< 22690 1727204272.53279: stdout chunk (state=3): >>><<< 22690 1727204272.53282: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204272.4901133-25245-60447783966118=/root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204272.53285: variable 'ansible_module_compression' from source: unknown 22690 1727204272.53290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 22690 1727204272.53469: variable 'ansible_facts' from source: unknown 22690 1727204272.53885: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py 22690 1727204272.54341: Sending initial data 22690 1727204272.54351: Sent initial data (161 bytes) 22690 1727204272.55824: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204272.55871: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204272.55899: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204272.55977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204272.56176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204272.57877: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204272.57944: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204272.58149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpu3p5d1gl /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py <<< 22690 1727204272.58153: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py" <<< 22690 1727204272.58432: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpu3p5d1gl" to remote "/root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py" <<< 22690 1727204272.62654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204272.63325: stderr chunk (state=3): >>><<< 22690 1727204272.63330: stdout chunk (state=3): >>><<< 22690 1727204272.63332: done transferring module to remote 22690 1727204272.63335: _low_level_execute_command(): starting 22690 1727204272.63338: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/ /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py && sleep 0' 22690 1727204272.64654: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204272.64791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204272.64795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204272.64799: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204272.64801: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204272.64804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204272.64984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204272.65070: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204272.66958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204272.67072: stderr chunk (state=3): >>><<< 22690 1727204272.67178: stdout chunk (state=3): >>><<< 22690 1727204272.67182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204272.67190: _low_level_execute_command(): starting 22690 1727204272.67201: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/AnsiballZ_package_facts.py && sleep 0' 22690 1727204272.68619: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204272.68687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204272.68816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204272.68972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204272.69042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204273.31957: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, <<< 22690 1727204273.31992: stdout chunk (state=3): >>>"arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": 
"elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "<<< 22690 1727204273.32092: stdout chunk (state=3): >>>systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": 
"0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": 
"inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", 
"release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": 
"3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": 
"elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source"<<< 22690 1727204273.32181: stdout chunk (state=3): >>>: "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": 
"41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", 
"release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": 
"14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": "wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": 
"0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoc<<< 22690 1727204273.32196: stdout chunk (state=3): >>>h": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": 
"23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "s<<< 22690 1727204273.32215: stdout chunk (state=3): >>>ource": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": 
"0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22690 1727204273.34038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204273.34175: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204273.34229: stderr chunk (state=3): >>><<< 22690 1727204273.34276: stdout chunk (state=3): >>><<< 22690 1727204273.34301: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "14.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "12.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "40", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "40", "release": "39", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "20.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240301", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": 
"libxml2", "version": "2.12.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "18.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.45.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "29.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "8.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.27", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.49", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.59.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "3.fc40", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", 
"release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "9.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.2.3", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "123.fc40", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtextstyle": [{"name": "libtextstyle", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "643", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.10.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", 
"release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "27.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "56.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.15", "release": "1.fc40", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "2.1.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "9.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.1", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240725", "release": "1.git28d3e2d.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.7", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": 
[{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.4", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.62_v7.0.401", "release": "6.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "13.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "75.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "255.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.3", "release": "1.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "124", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "28.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim-libs": [{"name": "passim-libs", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.3", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.30.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.23.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.78", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.80.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.48.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gsettings-desktop-schemas": [{"name": "gsettings-desktop-schemas", "version": "46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libproxy": [{"name": "libproxy", "version": "0.5.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib-networking": [{"name": "glib-networking", "version": "2.80.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsoup3": [{"name": "libsoup3", "version": "3.4.4", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passim": [{"name": "passim", "version": "0.1.7", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.22.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.34.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "5.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.23", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240117", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", 
"version": "1.5", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.104.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", 
"version": "1.9.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "41.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.6.0", "release": "10.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.27", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "14.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "13.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.1.3", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "123.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": 
"1.1.2", "release": "11.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "13.P1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.46.2", "release": "1.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.6p1", "release": "1.fc40.4", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "2.p5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "13.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "12.20240127.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "36.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "a15b79cc", "release": "63d04c2c", "epoch": null, "arch": null, "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "37.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8.1", "release": "1.fc40", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "63.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "26.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "14.2.1", "release": "3.fc40", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "16.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "40.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.28", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "5.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "2.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "504.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "503.fc40", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "503.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "6.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "1.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "504.fc40", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "503.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "503.fc40", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "503.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "502.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "506.fc40", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "502.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.57", "release": "4.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "502.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "502.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "505.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "506.fc40", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.39", "release": "22.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.7.1", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.83.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.6", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2021.11.0", "release": "5.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "6.fc40", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "guile30": [{"name": "guile30", "version": "3.0.7", "release": "12.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "6.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "14.2.1", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cmake-filesystem": [{"name": "cmake-filesystem", "version": "3.28.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat-devel": [{"name": "zlib-ng-compat-devel", "version": "2.1.7", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "200.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "10.fc40", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.1.0", "release": "7.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.20", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls-dane": [{"name": "gnutls-dane", "version": "3.8.6", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-libs": [{"name": "wget2-libs", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2": [{"name": "wget2", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget2-wget": [{"name": 
"wget2-wget", "version": "2.1.0", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "3.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "5.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "506.fc40", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "506.fc40", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "15.fc40", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "11.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "46.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "21.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc40eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc40eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc40", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.18.1", "release": "1.fc40", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0.2", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.8", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "14.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "22.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.20", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "9.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "4.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "4.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "8.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": 
"python3-requests", "version": "2.31.0", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.69.20160912git.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "3.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "19.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "7.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "3.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "8.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "2.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "1.fc40", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.11", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.11", "release": "3.fc40", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc40", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204273.37552: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204273.37585: _low_level_execute_command(): starting 22690 1727204273.37598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204272.4901133-25245-60447783966118/ > /dev/null 2>&1 && sleep 0' 22690 1727204273.38344: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204273.38362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204273.38403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204273.38490: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204273.38517: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204273.38538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204273.38563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204273.38676: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204273.40772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204273.40776: stdout chunk (state=3): >>><<< 22690 1727204273.40779: stderr chunk (state=3): >>><<< 22690 1727204273.40972: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204273.40976: handler run complete 22690 1727204273.42060: variable 'ansible_facts' from source: unknown 22690 1727204273.42857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204273.45694: variable 'ansible_facts' from source: unknown 22690 1727204273.46272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204273.48074: attempt loop complete, returning result 22690 1727204273.48117: _execute() done 22690 1727204273.48121: dumping result to json 22690 1727204273.48548: done dumping result, returning 22690 1727204273.48552: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [127b8e07-fff9-78bb-bf56-00000000045e] 22690 1727204273.48555: sending task result for task 127b8e07-fff9-78bb-bf56-00000000045e 22690 1727204273.53740: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000045e 22690 1727204273.53744: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204273.53899: no more pending results, returning what we have 22690 1727204273.53902: results queue empty 22690 1727204273.53903: checking for any_errors_fatal 22690 1727204273.53908: done checking for any_errors_fatal 22690 1727204273.53909: checking for max_fail_percentage 22690 1727204273.53911: done checking for max_fail_percentage 22690 1727204273.53912: checking to see if all hosts have failed and the running result is not ok 22690 1727204273.53913: done checking to see if all hosts have failed 22690 1727204273.53916: getting the remaining hosts for this loop 22690 1727204273.53917: done getting the remaining hosts for this loop 22690 1727204273.53921: getting the next task for host managed-node2 22690 1727204273.53928: done getting next task for host managed-node2 22690 1727204273.53932: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22690 1727204273.53934: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204273.53944: getting variables 22690 1727204273.53945: in VariableManager get_vars() 22690 1727204273.54189: Calling all_inventory to load vars for managed-node2 22690 1727204273.54193: Calling groups_inventory to load vars for managed-node2 22690 1727204273.54196: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204273.54207: Calling all_plugins_play to load vars for managed-node2 22690 1727204273.54211: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204273.54217: Calling groups_plugins_play to load vars for managed-node2 22690 1727204273.56219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204273.60913: done with get_vars() 22690 1727204273.61077: done getting variables 22690 1727204273.61146: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:53 -0400 (0:00:01.193) 0:00:40.896 ***** 22690 1727204273.61299: entering _queue_task() for managed-node2/debug 22690 1727204273.62159: worker is 1 (out of 1 available) 22690 1727204273.62175: exiting _queue_task() for managed-node2/debug 22690 1727204273.62189: done queuing things up, now waiting for results queue to drain 22690 1727204273.62191: waiting for pending results... 22690 1727204273.62587: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 22690 1727204273.62621: in run() - task 127b8e07-fff9-78bb-bf56-00000000005d 22690 1727204273.62645: variable 'ansible_search_path' from source: unknown 22690 1727204273.62652: variable 'ansible_search_path' from source: unknown 22690 1727204273.62704: calling self._execute() 22690 1727204273.62825: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204273.62837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204273.62852: variable 'omit' from source: magic vars 22690 1727204273.63271: variable 'ansible_distribution_major_version' from source: facts 22690 1727204273.63289: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204273.63336: variable 'omit' from source: magic vars 22690 1727204273.63351: variable 'omit' from source: magic vars 22690 1727204273.63585: variable 'network_provider' from source: set_fact 22690 1727204273.63871: variable 'omit' from source: magic vars 22690 1727204273.63874: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204273.63880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204273.63883: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204273.63886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204273.63889: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 
1727204273.63893: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204273.63896: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204273.63899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204273.63994: Set connection var ansible_connection to ssh 22690 1727204273.64020: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204273.64034: Set connection var ansible_pipelining to False 22690 1727204273.64041: Set connection var ansible_shell_type to sh 22690 1727204273.64123: Set connection var ansible_shell_executable to /bin/sh 22690 1727204273.64127: Set connection var ansible_timeout to 10 22690 1727204273.64129: variable 'ansible_shell_executable' from source: unknown 22690 1727204273.64132: variable 'ansible_connection' from source: unknown 22690 1727204273.64134: variable 'ansible_module_compression' from source: unknown 22690 1727204273.64136: variable 'ansible_shell_type' from source: unknown 22690 1727204273.64138: variable 'ansible_shell_executable' from source: unknown 22690 1727204273.64141: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204273.64144: variable 'ansible_pipelining' from source: unknown 22690 1727204273.64146: variable 'ansible_timeout' from source: unknown 22690 1727204273.64149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204273.64317: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204273.64340: variable 'omit' from source: magic vars 22690 1727204273.64351: starting attempt loop 22690 1727204273.64359: running the handler 22690 1727204273.64418: handler run complete 22690 1727204273.64440: attempt loop complete, returning result 22690 1727204273.64557: _execute() done 22690 1727204273.64560: dumping result to json 22690 1727204273.64563: done dumping result, returning 22690 1727204273.64568: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [127b8e07-fff9-78bb-bf56-00000000005d] 22690 1727204273.64570: sending task result for task 127b8e07-fff9-78bb-bf56-00000000005d 22690 1727204273.64650: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000005d 22690 1727204273.64654: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 22690 1727204273.64733: no more pending results, returning what we have 22690 1727204273.64737: results queue empty 22690 1727204273.64738: checking for any_errors_fatal 22690 1727204273.64751: done checking for any_errors_fatal 22690 1727204273.64752: checking for max_fail_percentage 22690 1727204273.64753: done checking for max_fail_percentage 22690 1727204273.64755: checking to see if all hosts have failed and the running result is not ok 22690 1727204273.64756: done checking to see if all hosts have failed 22690 1727204273.64757: getting the remaining hosts for this loop 22690 1727204273.64758: done getting the remaining hosts for this loop 22690 1727204273.64763: getting the next task for host managed-node2 22690 1727204273.64772: done getting next task for host managed-node2 22690 1727204273.64776: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 22690 1727204273.64779: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204273.64789: getting variables 22690 1727204273.64792: in VariableManager get_vars() 22690 1727204273.64838: Calling all_inventory to load vars for managed-node2 22690 1727204273.64842: Calling groups_inventory to load vars for managed-node2 22690 1727204273.64844: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204273.64857: Calling all_plugins_play to load vars for managed-node2 22690 1727204273.64861: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204273.64864: Calling groups_plugins_play to load vars for managed-node2 22690 1727204273.69795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204273.74918: done with get_vars() 22690 1727204273.74963: done getting variables 22690 1727204273.75040: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.137) 0:00:41.034 ***** 22690 1727204273.75085: entering _queue_task() for managed-node2/fail 22690 1727204273.75598: worker is 1 (out of 1 available) 22690 1727204273.75611: exiting _queue_task() for managed-node2/fail 22690 1727204273.75625: done queuing things up, now waiting for results queue to drain 22690 1727204273.75627: waiting for pending results... 
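The fail task queued here (main.yml:11 in the role's task file) and the two abort tasks that follow it are all conditional fail actions. The task file itself is not reproduced in this log, so the block below is only a minimal sketch of the shape such a guard takes; the message text and the provider check are assumptions. It illustrates why the result that follows reports "false_condition": "network_state != {}" and a skip: items in a when list are ANDed and evaluated in order, and the first one that comes back False is the one echoed in the result.

# Sketch only -- not the actual role source; msg and the second condition are assumed.
- name: >-
    Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported with the initscripts provider
  when:
    - network_state != {}                  # evaluated first; False on this host, so the task is skipped
    - network_provider == "initscripts"    # assumed second guard; never reached once the first is False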
22690 1727204273.75886: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22690 1727204273.76004: in run() - task 127b8e07-fff9-78bb-bf56-00000000005e 22690 1727204273.76012: variable 'ansible_search_path' from source: unknown 22690 1727204273.76071: variable 'ansible_search_path' from source: unknown 22690 1727204273.76076: calling self._execute() 22690 1727204273.76190: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204273.76202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204273.76225: variable 'omit' from source: magic vars 22690 1727204273.76678: variable 'ansible_distribution_major_version' from source: facts 22690 1727204273.76697: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204273.76840: variable 'network_state' from source: role '' defaults 22690 1727204273.76872: Evaluated conditional (network_state != {}): False 22690 1727204273.76876: when evaluation is False, skipping this task 22690 1727204273.76878: _execute() done 22690 1727204273.76996: dumping result to json 22690 1727204273.76999: done dumping result, returning 22690 1727204273.77003: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [127b8e07-fff9-78bb-bf56-00000000005e] 22690 1727204273.77006: sending task result for task 127b8e07-fff9-78bb-bf56-00000000005e 22690 1727204273.77087: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000005e 22690 1727204273.77091: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204273.77150: no more pending results, returning what we have 22690 1727204273.77154: results queue empty 22690 1727204273.77156: checking for any_errors_fatal 22690 1727204273.77169: done checking for any_errors_fatal 22690 1727204273.77170: checking for max_fail_percentage 22690 1727204273.77172: done checking for max_fail_percentage 22690 1727204273.77173: checking to see if all hosts have failed and the running result is not ok 22690 1727204273.77174: done checking to see if all hosts have failed 22690 1727204273.77175: getting the remaining hosts for this loop 22690 1727204273.77176: done getting the remaining hosts for this loop 22690 1727204273.77181: getting the next task for host managed-node2 22690 1727204273.77188: done getting next task for host managed-node2 22690 1727204273.77193: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22690 1727204273.77196: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204273.77277: getting variables 22690 1727204273.77279: in VariableManager get_vars() 22690 1727204273.77445: Calling all_inventory to load vars for managed-node2 22690 1727204273.77449: Calling groups_inventory to load vars for managed-node2 22690 1727204273.77451: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204273.77465: Calling all_plugins_play to load vars for managed-node2 22690 1727204273.77471: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204273.77474: Calling groups_plugins_play to load vars for managed-node2 22690 1727204273.80325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204273.99009: done with get_vars() 22690 1727204273.99056: done getting variables 22690 1727204273.99320: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.242) 0:00:41.276 ***** 22690 1727204273.99350: entering _queue_task() for managed-node2/fail 22690 1727204274.00228: worker is 1 (out of 1 available) 22690 1727204274.00244: exiting _queue_task() for managed-node2/fail 22690 1727204274.00258: done queuing things up, now waiting for results queue to drain 22690 1727204274.00260: waiting for pending results... 
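Both "Abort applying the network state configuration ..." guards above are skipped for the same reason: network_state resolves from "role '' defaults" and compares equal to {}, i.e. the role default is an empty mapping and this play never overrides it. A minimal sketch of that default and of the kind of override that would make the guards evaluate further (the file path and the override key are assumed; neither the defaults file nor the play source appears in this log):

---
# roles/network/defaults/main.yml (assumed path): the log only confirms that
# network_state comes from role defaults and equals {}.
network_state: {}
---
# Play-level override (illustrative): any non-empty mapping makes
# "network_state != {}" evaluate True.
- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:
          interfaces: []   # illustrative key; the accepted schema is not shown in this log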
22690 1727204274.01028: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22690 1727204274.01266: in run() - task 127b8e07-fff9-78bb-bf56-00000000005f 22690 1727204274.01275: variable 'ansible_search_path' from source: unknown 22690 1727204274.01279: variable 'ansible_search_path' from source: unknown 22690 1727204274.01283: calling self._execute() 22690 1727204274.01592: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204274.01936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204274.01941: variable 'omit' from source: magic vars 22690 1727204274.02686: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.02972: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204274.03149: variable 'network_state' from source: role '' defaults 22690 1727204274.03172: Evaluated conditional (network_state != {}): False 22690 1727204274.03182: when evaluation is False, skipping this task 22690 1727204274.03191: _execute() done 22690 1727204274.03205: dumping result to json 22690 1727204274.03214: done dumping result, returning 22690 1727204274.03228: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [127b8e07-fff9-78bb-bf56-00000000005f] 22690 1727204274.03239: sending task result for task 127b8e07-fff9-78bb-bf56-00000000005f 22690 1727204274.03497: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000005f 22690 1727204274.03501: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204274.03557: no more pending results, returning what we have 22690 1727204274.03562: results queue empty 22690 1727204274.03563: checking for any_errors_fatal 22690 1727204274.03578: done checking for any_errors_fatal 22690 1727204274.03579: checking for max_fail_percentage 22690 1727204274.03581: done checking for max_fail_percentage 22690 1727204274.03581: checking to see if all hosts have failed and the running result is not ok 22690 1727204274.03582: done checking to see if all hosts have failed 22690 1727204274.03583: getting the remaining hosts for this loop 22690 1727204274.03585: done getting the remaining hosts for this loop 22690 1727204274.03589: getting the next task for host managed-node2 22690 1727204274.03596: done getting next task for host managed-node2 22690 1727204274.03601: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22690 1727204274.03603: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204274.03622: getting variables 22690 1727204274.03625: in VariableManager get_vars() 22690 1727204274.03825: Calling all_inventory to load vars for managed-node2 22690 1727204274.03829: Calling groups_inventory to load vars for managed-node2 22690 1727204274.03832: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204274.03848: Calling all_plugins_play to load vars for managed-node2 22690 1727204274.03852: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204274.03855: Calling groups_plugins_play to load vars for managed-node2 22690 1727204274.08046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204274.12403: done with get_vars() 22690 1727204274.12445: done getting variables 22690 1727204274.12524: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.132) 0:00:41.409 ***** 22690 1727204274.12598: entering _queue_task() for managed-node2/fail 22690 1727204274.13290: worker is 1 (out of 1 available) 22690 1727204274.13304: exiting _queue_task() for managed-node2/fail 22690 1727204274.13318: done queuing things up, now waiting for results queue to drain 22690 1727204274.13320: waiting for pending results... 
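The teaming abort guard just queued compares ansible_distribution_major_version (cast to int) and ansible_distribution against the role variable __network_rh_distros; on this host the version check passes (| int > 9 is True) but the distribution membership check does not, so the task is skipped below. A small stand-alone play, illustrative only and not part of the role, that prints the facts those conditions consume so the outcome can be verified by hand:

# Illustrative helper play -- not part of fedora.linux_system_roles.network.
- name: Show the distribution facts the abort guards evaluate
  hosts: managed-node2
  gather_facts: true
  tasks:
    - name: Print distribution and major version
      ansible.builtin.debug:
        msg: >-
          {{ ansible_distribution }} {{ ansible_distribution_major_version }}
          (as int: {{ ansible_distribution_major_version | int }})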
22690 1727204274.14327: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22690 1727204274.14773: in run() - task 127b8e07-fff9-78bb-bf56-000000000060 22690 1727204274.14778: variable 'ansible_search_path' from source: unknown 22690 1727204274.14781: variable 'ansible_search_path' from source: unknown 22690 1727204274.14784: calling self._execute() 22690 1727204274.14786: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204274.15272: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204274.15277: variable 'omit' from source: magic vars 22690 1727204274.16272: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.16277: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204274.16453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204274.20649: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204274.21880: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204274.21934: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204274.21976: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204274.22201: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204274.22368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.22495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.22535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.22587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.22613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.22734: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.22756: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22690 1727204274.22900: variable 'ansible_distribution' from source: facts 22690 1727204274.22910: variable '__network_rh_distros' from source: role '' defaults 22690 1727204274.22933: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22690 1727204274.22943: when evaluation is False, skipping this task 22690 1727204274.22951: _execute() done 22690 1727204274.22957: dumping result to json 22690 1727204274.22966: done dumping result, returning 22690 1727204274.22979: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [127b8e07-fff9-78bb-bf56-000000000060] 22690 1727204274.22988: sending task result for task 127b8e07-fff9-78bb-bf56-000000000060 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22690 1727204274.23176: no more pending results, returning what we have 22690 1727204274.23181: results queue empty 22690 1727204274.23182: checking for any_errors_fatal 22690 1727204274.23188: done checking for any_errors_fatal 22690 1727204274.23189: checking for max_fail_percentage 22690 1727204274.23191: done checking for max_fail_percentage 22690 1727204274.23191: checking to see if all hosts have failed and the running result is not ok 22690 1727204274.23192: done checking to see if all hosts have failed 22690 1727204274.23193: getting the remaining hosts for this loop 22690 1727204274.23194: done getting the remaining hosts for this loop 22690 1727204274.23200: getting the next task for host managed-node2 22690 1727204274.23206: done getting next task for host managed-node2 22690 1727204274.23210: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22690 1727204274.23212: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204274.23227: getting variables 22690 1727204274.23229: in VariableManager get_vars() 22690 1727204274.23283: Calling all_inventory to load vars for managed-node2 22690 1727204274.23286: Calling groups_inventory to load vars for managed-node2 22690 1727204274.23288: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204274.23301: Calling all_plugins_play to load vars for managed-node2 22690 1727204274.23305: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204274.23308: Calling groups_plugins_play to load vars for managed-node2 22690 1727204274.23983: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000060 22690 1727204274.23987: WORKER PROCESS EXITING 22690 1727204274.29011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204274.35126: done with get_vars() 22690 1727204274.35334: done getting variables 22690 1727204274.35400: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.228) 0:00:41.637 ***** 22690 1727204274.35431: entering _queue_task() for managed-node2/dnf 22690 1727204274.35833: worker is 1 (out of 1 available) 22690 1727204274.35849: exiting _queue_task() 
for managed-node2/dnf 22690 1727204274.35861: done queuing things up, now waiting for results queue to drain 22690 1727204274.35862: waiting for pending results... 22690 1727204274.36189: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22690 1727204274.36320: in run() - task 127b8e07-fff9-78bb-bf56-000000000061 22690 1727204274.36350: variable 'ansible_search_path' from source: unknown 22690 1727204274.36359: variable 'ansible_search_path' from source: unknown 22690 1727204274.36414: calling self._execute() 22690 1727204274.36553: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204274.36566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204274.36582: variable 'omit' from source: magic vars 22690 1727204274.37031: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.37049: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204274.37302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204274.39956: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204274.40060: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204274.40110: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204274.40171: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204274.40196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204274.40354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.40360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.40382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.40430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.40448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.40599: variable 'ansible_distribution' from source: facts 22690 1727204274.40612: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.40625: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22690 1727204274.40769: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204274.40937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.41009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.41015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.41058: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.41081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.41139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.41171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.41226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.41257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.41277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.41327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.41371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.41444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.41459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.41482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.41669: variable 'network_connections' from source: play vars 22690 1727204274.41692: variable 'profile' from source: play vars 22690 1727204274.41785: variable 'profile' from source: play vars 22690 1727204274.41794: variable 'interface' from source: set_fact 22690 
1727204274.41883: variable 'interface' from source: set_fact 22690 1727204274.41953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204274.42180: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204274.42370: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204274.42374: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204274.42377: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204274.42379: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204274.42382: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204274.42427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.42460: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204274.42523: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204274.42816: variable 'network_connections' from source: play vars 22690 1727204274.42834: variable 'profile' from source: play vars 22690 1727204274.42905: variable 'profile' from source: play vars 22690 1727204274.42915: variable 'interface' from source: set_fact 22690 1727204274.42989: variable 'interface' from source: set_fact 22690 1727204274.43051: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204274.43054: when evaluation is False, skipping this task 22690 1727204274.43057: _execute() done 22690 1727204274.43059: dumping result to json 22690 1727204274.43061: done dumping result, returning 22690 1727204274.43064: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000061] 22690 1727204274.43071: sending task result for task 127b8e07-fff9-78bb-bf56-000000000061 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204274.43364: no more pending results, returning what we have 22690 1727204274.43371: results queue empty 22690 1727204274.43372: checking for any_errors_fatal 22690 1727204274.43381: done checking for any_errors_fatal 22690 1727204274.43382: checking for max_fail_percentage 22690 1727204274.43384: done checking for max_fail_percentage 22690 1727204274.43385: checking to see if all hosts have failed and the running result is not ok 22690 1727204274.43386: done checking to see if all hosts have failed 22690 1727204274.43387: getting the remaining hosts for this loop 22690 1727204274.43389: done getting the remaining hosts for this loop 
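The DNF update check above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true; the log shows both being resolved from the network_connections, profile and interface play variables, but the Jinja expressions behind them live in the role and are not printed here. A sketch of how such booleans are commonly derived from a connection list (the expressions below are assumptions; only the variable names come from the log):

# Assumed derivation, shown to illustrate the guard; the role's real expressions
# are not visible in this log.
__network_wireless_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
                         | selectattr('type', 'equalto', 'wireless')
                         | list | length > 0 }}
__network_team_connections_defined: >-
  {{ network_connections | selectattr('type', 'defined')
                         | selectattr('type', 'equalto', 'team')
                         | list | length > 0 }}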
22690 1727204274.43393: getting the next task for host managed-node2 22690 1727204274.43400: done getting next task for host managed-node2 22690 1727204274.43404: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22690 1727204274.43407: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204274.43423: getting variables 22690 1727204274.43425: in VariableManager get_vars() 22690 1727204274.43584: Calling all_inventory to load vars for managed-node2 22690 1727204274.43591: Calling groups_inventory to load vars for managed-node2 22690 1727204274.43594: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204274.43608: Calling all_plugins_play to load vars for managed-node2 22690 1727204274.43612: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204274.43616: Calling groups_plugins_play to load vars for managed-node2 22690 1727204274.44284: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000061 22690 1727204274.44288: WORKER PROCESS EXITING 22690 1727204274.46655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204274.50815: done with get_vars() 22690 1727204274.50971: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22690 1727204274.51057: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.157) 0:00:41.795 ***** 22690 1727204274.51219: entering _queue_task() for managed-node2/yum 22690 1727204274.52038: worker is 1 (out of 1 available) 22690 1727204274.52053: exiting _queue_task() for managed-node2/yum 22690 1727204274.52469: done queuing things up, now waiting for results queue to drain 22690 1727204274.52472: waiting for pending results... 
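The YUM variant of the same check is queued next, and the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above shows that ansible.builtin.yum is only an alias that gets redirected to the dnf action here. The result that follows skips it on the first guard, ansible_distribution_major_version | int < 8, which is False on this Fedora 40 host. A sketch of the general shape of such a task (the package list, check_mode and the trailing condition are assumptions; the role body is not shown in this log):

# Sketch only -- module options are assumed, not taken from the role.
- name: >-
    Check if updates for network packages are available through the YUM
    package manager due to wireless or team interfaces
  ansible.builtin.yum:               # redirected to ansible.builtin.dnf at runtime, as logged above
    name: "{{ network_packages }}"   # assumed variable name
    state: latest
  check_mode: true                   # assumed: only report whether updates would be installed
  when:
    - ansible_distribution_major_version | int < 8   # False here, so nothing below it is evaluated
    - __network_wireless_connections_defined or __network_team_connections_defined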
22690 1727204274.52918: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22690 1727204274.53181: in run() - task 127b8e07-fff9-78bb-bf56-000000000062 22690 1727204274.53243: variable 'ansible_search_path' from source: unknown 22690 1727204274.53401: variable 'ansible_search_path' from source: unknown 22690 1727204274.53455: calling self._execute() 22690 1727204274.53673: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204274.53677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204274.53680: variable 'omit' from source: magic vars 22690 1727204274.54073: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.54077: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204274.54331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204274.58975: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204274.59218: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204274.59273: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204274.59388: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204274.59486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204274.59693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.59732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.59770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.59938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.59961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.60159: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.60238: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22690 1727204274.60247: when evaluation is False, skipping this task 22690 1727204274.60255: _execute() done 22690 1727204274.60263: dumping result to json 22690 1727204274.60371: done dumping result, returning 22690 1727204274.60376: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000062] 22690 
1727204274.60379: sending task result for task 127b8e07-fff9-78bb-bf56-000000000062 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22690 1727204274.60650: no more pending results, returning what we have 22690 1727204274.60655: results queue empty 22690 1727204274.60656: checking for any_errors_fatal 22690 1727204274.60668: done checking for any_errors_fatal 22690 1727204274.60670: checking for max_fail_percentage 22690 1727204274.60672: done checking for max_fail_percentage 22690 1727204274.60673: checking to see if all hosts have failed and the running result is not ok 22690 1727204274.60674: done checking to see if all hosts have failed 22690 1727204274.60675: getting the remaining hosts for this loop 22690 1727204274.60676: done getting the remaining hosts for this loop 22690 1727204274.60681: getting the next task for host managed-node2 22690 1727204274.60689: done getting next task for host managed-node2 22690 1727204274.60693: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22690 1727204274.60696: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204274.60712: getting variables 22690 1727204274.60714: in VariableManager get_vars() 22690 1727204274.60982: Calling all_inventory to load vars for managed-node2 22690 1727204274.60985: Calling groups_inventory to load vars for managed-node2 22690 1727204274.60987: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204274.60999: Calling all_plugins_play to load vars for managed-node2 22690 1727204274.61002: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204274.61005: Calling groups_plugins_play to load vars for managed-node2 22690 1727204274.61876: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000062 22690 1727204274.61881: WORKER PROCESS EXITING 22690 1727204274.65382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204274.69814: done with get_vars() 22690 1727204274.69855: done getting variables 22690 1727204274.70134: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.189) 0:00:41.985 ***** 22690 1727204274.70170: entering _queue_task() for managed-node2/fail 22690 1727204274.70924: worker is 1 (out of 1 available) 22690 1727204274.70939: exiting _queue_task() for managed-node2/fail 22690 1727204274.70954: done queuing things up, now waiting for results queue to drain 22690 1727204274.70955: waiting for pending results... 
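The task queued here, "Ask user's consent to restart NetworkManager due to wireless or team interfaces" (main.yml:60), is again a fail action, i.e. a consent gate that aborts the run unless the operator has opted in to a NetworkManager restart. The sketch below shows that general pattern; only the wireless/team guard is taken from the log, while the message wording and the name of the opt-in variable are assumptions.

# Consent-gate sketch; the opt-in variable name and the message are assumed.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: >-
      Managing wireless or team profiles requires restarting NetworkManager.
      Re-run with the opt-in variable set to true to allow this.
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not (network_allow_restart | default(false))   # assumed opt-in variable name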
22690 1727204274.71986: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22690 1727204274.72182: in run() - task 127b8e07-fff9-78bb-bf56-000000000063 22690 1727204274.72538: variable 'ansible_search_path' from source: unknown 22690 1727204274.72543: variable 'ansible_search_path' from source: unknown 22690 1727204274.72547: calling self._execute() 22690 1727204274.73174: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204274.73178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204274.73181: variable 'omit' from source: magic vars 22690 1727204274.74292: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.74314: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204274.74708: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204274.75100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204274.79310: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204274.79807: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204274.79862: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204274.79907: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204274.79941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204274.80047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.80092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.80127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.80292: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.80295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.80298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.80302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.80323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.80376: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.80402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.80453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.80485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.80522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.80573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.80593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.80804: variable 'network_connections' from source: play vars 22690 1727204274.80824: variable 'profile' from source: play vars 22690 1727204274.80916: variable 'profile' from source: play vars 22690 1727204274.80928: variable 'interface' from source: set_fact 22690 1727204274.81006: variable 'interface' from source: set_fact 22690 1727204274.81110: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204274.81433: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204274.81481: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204274.81523: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204274.81601: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204274.81641: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204274.81673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204274.81771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.81775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204274.81797: 
variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204274.82089: variable 'network_connections' from source: play vars 22690 1727204274.82101: variable 'profile' from source: play vars 22690 1727204274.82253: variable 'profile' from source: play vars 22690 1727204274.82332: variable 'interface' from source: set_fact 22690 1727204274.82409: variable 'interface' from source: set_fact 22690 1727204274.82441: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204274.82450: when evaluation is False, skipping this task 22690 1727204274.82457: _execute() done 22690 1727204274.82466: dumping result to json 22690 1727204274.82481: done dumping result, returning 22690 1727204274.82495: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000063] 22690 1727204274.82514: sending task result for task 127b8e07-fff9-78bb-bf56-000000000063 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204274.82879: no more pending results, returning what we have 22690 1727204274.82884: results queue empty 22690 1727204274.82885: checking for any_errors_fatal 22690 1727204274.82892: done checking for any_errors_fatal 22690 1727204274.82893: checking for max_fail_percentage 22690 1727204274.82895: done checking for max_fail_percentage 22690 1727204274.82896: checking to see if all hosts have failed and the running result is not ok 22690 1727204274.82897: done checking to see if all hosts have failed 22690 1727204274.82898: getting the remaining hosts for this loop 22690 1727204274.82900: done getting the remaining hosts for this loop 22690 1727204274.82904: getting the next task for host managed-node2 22690 1727204274.82910: done getting next task for host managed-node2 22690 1727204274.82914: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22690 1727204274.82916: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204274.82932: getting variables 22690 1727204274.82933: in VariableManager get_vars() 22690 1727204274.82978: Calling all_inventory to load vars for managed-node2 22690 1727204274.82981: Calling groups_inventory to load vars for managed-node2 22690 1727204274.82984: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204274.82996: Calling all_plugins_play to load vars for managed-node2 22690 1727204274.83000: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204274.83003: Calling groups_plugins_play to load vars for managed-node2 22690 1727204274.83612: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000063 22690 1727204274.83615: WORKER PROCESS EXITING 22690 1727204274.85123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204274.88069: done with get_vars() 22690 1727204274.88114: done getting variables 22690 1727204274.88192: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:54 -0400 (0:00:00.180) 0:00:42.165 ***** 22690 1727204274.88228: entering _queue_task() for managed-node2/package 22690 1727204274.88755: worker is 1 (out of 1 available) 22690 1727204274.88770: exiting _queue_task() for managed-node2/package 22690 1727204274.88783: done queuing things up, now waiting for results queue to drain 22690 1727204274.88784: waiting for pending results... 
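The consent task above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for this profile, exactly as the false_condition in its result shows. The log also records the 'fail' action plugin being loaded for it, so a task shaped like the one below would behave the same way; the when: expressions come from the log, while the message wording is purely illustrative and not the role's actual text.

# Illustrative sketch only -- `fail` action and `when:` expressions from the log; message text invented.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.fail:
    msg: NetworkManager would have to be restarted because wireless or team connections are being managed; aborting so the user can confirm.
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined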
22690 1727204274.89019: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 22690 1727204274.89185: in run() - task 127b8e07-fff9-78bb-bf56-000000000064 22690 1727204274.89227: variable 'ansible_search_path' from source: unknown 22690 1727204274.89250: variable 'ansible_search_path' from source: unknown 22690 1727204274.89329: calling self._execute() 22690 1727204274.89483: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204274.89503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204274.89520: variable 'omit' from source: magic vars 22690 1727204274.90049: variable 'ansible_distribution_major_version' from source: facts 22690 1727204274.90077: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204274.90315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204274.90631: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204274.90693: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204274.90784: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204274.90911: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204274.91034: variable 'network_packages' from source: role '' defaults 22690 1727204274.91172: variable '__network_provider_setup' from source: role '' defaults 22690 1727204274.91190: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204274.91308: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204274.91312: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204274.91371: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204274.91591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204274.94213: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204274.94339: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204274.94359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204274.94408: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204274.94458: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204274.94857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.94861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.94864: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.94869: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.94872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.94897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.94915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.94945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.94991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.95013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.95279: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22690 1727204274.95414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.95441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.95468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.95512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.95534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.95634: variable 'ansible_python' from source: facts 22690 1727204274.95662: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22690 1727204274.95749: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204274.95829: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204274.96072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.96075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22690 1727204274.96078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.96081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.96083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.96271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204274.96282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204274.96285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.96288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204274.96290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204274.96468: variable 'network_connections' from source: play vars 22690 1727204274.96481: variable 'profile' from source: play vars 22690 1727204274.96604: variable 'profile' from source: play vars 22690 1727204274.96619: variable 'interface' from source: set_fact 22690 1727204274.96722: variable 'interface' from source: set_fact 22690 1727204274.96819: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204274.96864: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204274.96905: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204274.96955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204274.97027: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204274.97411: variable 'network_connections' from source: play vars 22690 1727204274.97428: variable 'profile' from source: play vars 22690 1727204274.97563: variable 'profile' from source: play vars 22690 1727204274.97581: variable 'interface' from source: set_fact 22690 1727204274.97701: variable 'interface' from source: set_fact 22690 1727204274.97726: variable 
'__network_packages_default_wireless' from source: role '' defaults 22690 1727204274.97854: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204274.98269: variable 'network_connections' from source: play vars 22690 1727204274.98283: variable 'profile' from source: play vars 22690 1727204274.98366: variable 'profile' from source: play vars 22690 1727204274.98397: variable 'interface' from source: set_fact 22690 1727204274.98573: variable 'interface' from source: set_fact 22690 1727204274.98576: variable '__network_packages_default_team' from source: role '' defaults 22690 1727204274.98647: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204274.99211: variable 'network_connections' from source: play vars 22690 1727204274.99226: variable 'profile' from source: play vars 22690 1727204274.99304: variable 'profile' from source: play vars 22690 1727204274.99313: variable 'interface' from source: set_fact 22690 1727204274.99429: variable 'interface' from source: set_fact 22690 1727204274.99505: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204274.99589: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204274.99603: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204274.99766: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204275.00035: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22690 1727204275.00734: variable 'network_connections' from source: play vars 22690 1727204275.00757: variable 'profile' from source: play vars 22690 1727204275.00836: variable 'profile' from source: play vars 22690 1727204275.00856: variable 'interface' from source: set_fact 22690 1727204275.00931: variable 'interface' from source: set_fact 22690 1727204275.00951: variable 'ansible_distribution' from source: facts 22690 1727204275.00973: variable '__network_rh_distros' from source: role '' defaults 22690 1727204275.00984: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.01004: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22690 1727204275.01218: variable 'ansible_distribution' from source: facts 22690 1727204275.01292: variable '__network_rh_distros' from source: role '' defaults 22690 1727204275.01301: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.01305: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22690 1727204275.01688: variable 'ansible_distribution' from source: facts 22690 1727204275.01692: variable '__network_rh_distros' from source: role '' defaults 22690 1727204275.01694: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.01697: variable 'network_provider' from source: set_fact 22690 1727204275.01699: variable 'ansible_facts' from source: unknown 22690 1727204275.03672: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22690 1727204275.03694: when evaluation is False, skipping this task 22690 1727204275.03707: _execute() done 22690 1727204275.03723: dumping result to json 22690 1727204275.03764: done dumping result, returning 22690 1727204275.03783: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 
[127b8e07-fff9-78bb-bf56-000000000064] 22690 1727204275.03794: sending task result for task 127b8e07-fff9-78bb-bf56-000000000064 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22690 1727204275.04255: no more pending results, returning what we have 22690 1727204275.04261: results queue empty 22690 1727204275.04263: checking for any_errors_fatal 22690 1727204275.04276: done checking for any_errors_fatal 22690 1727204275.04277: checking for max_fail_percentage 22690 1727204275.04279: done checking for max_fail_percentage 22690 1727204275.04280: checking to see if all hosts have failed and the running result is not ok 22690 1727204275.04281: done checking to see if all hosts have failed 22690 1727204275.04282: getting the remaining hosts for this loop 22690 1727204275.04284: done getting the remaining hosts for this loop 22690 1727204275.04289: getting the next task for host managed-node2 22690 1727204275.04296: done getting next task for host managed-node2 22690 1727204275.04301: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22690 1727204275.04304: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204275.04320: getting variables 22690 1727204275.04322: in VariableManager get_vars() 22690 1727204275.04553: Calling all_inventory to load vars for managed-node2 22690 1727204275.04556: Calling groups_inventory to load vars for managed-node2 22690 1727204275.04559: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204275.04576: Calling all_plugins_play to load vars for managed-node2 22690 1727204275.04591: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204275.04660: Calling groups_plugins_play to load vars for managed-node2 22690 1727204275.05214: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000064 22690 1727204275.05218: WORKER PROCESS EXITING 22690 1727204275.07164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204275.10305: done with get_vars() 22690 1727204275.10638: done getting variables 22690 1727204275.10840: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:55 -0400 (0:00:00.226) 0:00:42.392 ***** 22690 1727204275.10893: entering _queue_task() for managed-node2/package 22690 1727204275.11494: worker is 1 (out of 1 available) 22690 1727204275.11621: exiting _queue_task() for managed-node2/package 22690 1727204275.11633: done queuing things up, now waiting for results queue to drain 22690 1727204275.11635: waiting for pending results... 
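The Install packages skip shows the guard this run hit: not network_packages is subset(ansible_facts.packages.keys()), i.e. every requested package is already present in the collected package facts (ansible_facts.packages is the structure a package_facts run populates). A sketch of a package task gated that way is below; the when: expressions are copied from the skip result and the earlier conditional, while the module parameters are assumptions.

# Illustrative sketch only -- `when:` expressions from the log; module parameters assumed.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - not network_packages is subset(ansible_facts.packages.keys())

Gating on the subset test keeps the package module from being invoked at all when nothing is missing, which is cheaper than letting it report ok against every host on every run.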
22690 1727204275.11872: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22690 1727204275.12002: in run() - task 127b8e07-fff9-78bb-bf56-000000000065 22690 1727204275.12027: variable 'ansible_search_path' from source: unknown 22690 1727204275.12036: variable 'ansible_search_path' from source: unknown 22690 1727204275.12086: calling self._execute() 22690 1727204275.12274: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204275.12279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204275.12282: variable 'omit' from source: magic vars 22690 1727204275.12669: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.12691: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204275.12844: variable 'network_state' from source: role '' defaults 22690 1727204275.12869: Evaluated conditional (network_state != {}): False 22690 1727204275.12878: when evaluation is False, skipping this task 22690 1727204275.12885: _execute() done 22690 1727204275.12893: dumping result to json 22690 1727204275.12901: done dumping result, returning 22690 1727204275.12912: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [127b8e07-fff9-78bb-bf56-000000000065] 22690 1727204275.12965: sending task result for task 127b8e07-fff9-78bb-bf56-000000000065 22690 1727204275.13113: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000065 22690 1727204275.13117: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204275.13207: no more pending results, returning what we have 22690 1727204275.13213: results queue empty 22690 1727204275.13214: checking for any_errors_fatal 22690 1727204275.13222: done checking for any_errors_fatal 22690 1727204275.13223: checking for max_fail_percentage 22690 1727204275.13227: done checking for max_fail_percentage 22690 1727204275.13227: checking to see if all hosts have failed and the running result is not ok 22690 1727204275.13228: done checking to see if all hosts have failed 22690 1727204275.13229: getting the remaining hosts for this loop 22690 1727204275.13232: done getting the remaining hosts for this loop 22690 1727204275.13237: getting the next task for host managed-node2 22690 1727204275.13244: done getting next task for host managed-node2 22690 1727204275.13252: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22690 1727204275.13254: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204275.13275: getting variables 22690 1727204275.13277: in VariableManager get_vars() 22690 1727204275.13325: Calling all_inventory to load vars for managed-node2 22690 1727204275.13328: Calling groups_inventory to load vars for managed-node2 22690 1727204275.13331: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204275.13347: Calling all_plugins_play to load vars for managed-node2 22690 1727204275.13351: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204275.13354: Calling groups_plugins_play to load vars for managed-node2 22690 1727204275.16428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204275.20345: done with get_vars() 22690 1727204275.20594: done getting variables 22690 1727204275.20668: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:55 -0400 (0:00:00.098) 0:00:42.490 ***** 22690 1727204275.20706: entering _queue_task() for managed-node2/package 22690 1727204275.21510: worker is 1 (out of 1 available) 22690 1727204275.21525: exiting _queue_task() for managed-node2/package 22690 1727204275.21540: done queuing things up, now waiting for results queue to drain 22690 1727204275.21542: waiting for pending results... 
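Both network_state-gated install tasks (main.yml:85 above and main.yml:96 queued next) skip because network_state != {} is False, i.e. this play drives the role through network_connections rather than network_state. A sketch of the first of the pair follows; only the conditional expressions are taken from the log, and the package names and state: present are assumptions implied by the task title.

# Illustrative sketch only -- conditionals from the log; package names assumed from the task title.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}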
22690 1727204275.22368: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22690 1727204275.22377: in run() - task 127b8e07-fff9-78bb-bf56-000000000066 22690 1727204275.22381: variable 'ansible_search_path' from source: unknown 22690 1727204275.22384: variable 'ansible_search_path' from source: unknown 22690 1727204275.22401: calling self._execute() 22690 1727204275.22759: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204275.22764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204275.22777: variable 'omit' from source: magic vars 22690 1727204275.23652: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.23663: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204275.23909: variable 'network_state' from source: role '' defaults 22690 1727204275.23926: Evaluated conditional (network_state != {}): False 22690 1727204275.23930: when evaluation is False, skipping this task 22690 1727204275.24047: _execute() done 22690 1727204275.24051: dumping result to json 22690 1727204275.24054: done dumping result, returning 22690 1727204275.24063: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [127b8e07-fff9-78bb-bf56-000000000066] 22690 1727204275.24070: sending task result for task 127b8e07-fff9-78bb-bf56-000000000066 22690 1727204275.24271: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000066 22690 1727204275.24275: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204275.24328: no more pending results, returning what we have 22690 1727204275.24333: results queue empty 22690 1727204275.24335: checking for any_errors_fatal 22690 1727204275.24344: done checking for any_errors_fatal 22690 1727204275.24345: checking for max_fail_percentage 22690 1727204275.24348: done checking for max_fail_percentage 22690 1727204275.24348: checking to see if all hosts have failed and the running result is not ok 22690 1727204275.24349: done checking to see if all hosts have failed 22690 1727204275.24350: getting the remaining hosts for this loop 22690 1727204275.24352: done getting the remaining hosts for this loop 22690 1727204275.24357: getting the next task for host managed-node2 22690 1727204275.24363: done getting next task for host managed-node2 22690 1727204275.24372: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22690 1727204275.24375: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204275.24394: getting variables 22690 1727204275.24397: in VariableManager get_vars() 22690 1727204275.24443: Calling all_inventory to load vars for managed-node2 22690 1727204275.24447: Calling groups_inventory to load vars for managed-node2 22690 1727204275.24449: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204275.24738: Calling all_plugins_play to load vars for managed-node2 22690 1727204275.24744: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204275.24749: Calling groups_plugins_play to load vars for managed-node2 22690 1727204275.28902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204275.34060: done with get_vars() 22690 1727204275.34108: done getting variables 22690 1727204275.34390: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:55 -0400 (0:00:00.137) 0:00:42.627 ***** 22690 1727204275.34428: entering _queue_task() for managed-node2/service 22690 1727204275.35237: worker is 1 (out of 1 available) 22690 1727204275.35253: exiting _queue_task() for managed-node2/service 22690 1727204275.35271: done queuing things up, now waiting for results queue to drain 22690 1727204275.35273: waiting for pending results... 
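The task queued above, Restart NetworkManager due to wireless or team interfaces, skips for the same wireless/team condition already seen on the consent task (its result appears further down in the log), and the 'service' action plugin is loaded for it. A sketch of a task consistent with those traces is below; the when: expressions are from the log, while the service name and restarted state are assumptions implied by the task title.

# Illustrative sketch only -- `when:` expressions from the log; service name and state assumed.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when:
    - ansible_distribution_major_version != '6'
    - __network_wireless_connections_defined or __network_team_connections_defined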
22690 1727204275.35988: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22690 1727204275.36124: in run() - task 127b8e07-fff9-78bb-bf56-000000000067 22690 1727204275.36329: variable 'ansible_search_path' from source: unknown 22690 1727204275.36334: variable 'ansible_search_path' from source: unknown 22690 1727204275.36337: calling self._execute() 22690 1727204275.36527: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204275.36875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204275.36880: variable 'omit' from source: magic vars 22690 1727204275.37555: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.37581: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204275.37845: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204275.38416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204275.41570: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204275.41679: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204275.41738: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204275.41783: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204275.41823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204275.41920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.41969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.42005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.42061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.42086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.42152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.42186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.42217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22690 1727204275.42276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.42298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.42394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.42422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.42447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.42499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.42518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.42836: variable 'network_connections' from source: play vars 22690 1727204275.42972: variable 'profile' from source: play vars 22690 1727204275.43021: variable 'profile' from source: play vars 22690 1727204275.43091: variable 'interface' from source: set_fact 22690 1727204275.43293: variable 'interface' from source: set_fact 22690 1727204275.43572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204275.43892: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204275.44017: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204275.44108: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204275.44187: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204275.44315: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204275.44354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204275.44549: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.44554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204275.44627: variable '__network_team_connections_defined' from source: role '' defaults 22690 
1727204275.45353: variable 'network_connections' from source: play vars 22690 1727204275.45375: variable 'profile' from source: play vars 22690 1727204275.45604: variable 'profile' from source: play vars 22690 1727204275.45606: variable 'interface' from source: set_fact 22690 1727204275.45692: variable 'interface' from source: set_fact 22690 1727204275.45737: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22690 1727204275.45753: when evaluation is False, skipping this task 22690 1727204275.45778: _execute() done 22690 1727204275.45795: dumping result to json 22690 1727204275.45842: done dumping result, returning 22690 1727204275.45928: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [127b8e07-fff9-78bb-bf56-000000000067] 22690 1727204275.45942: sending task result for task 127b8e07-fff9-78bb-bf56-000000000067 22690 1727204275.46019: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000067 22690 1727204275.46023: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22690 1727204275.46093: no more pending results, returning what we have 22690 1727204275.46096: results queue empty 22690 1727204275.46098: checking for any_errors_fatal 22690 1727204275.46105: done checking for any_errors_fatal 22690 1727204275.46106: checking for max_fail_percentage 22690 1727204275.46108: done checking for max_fail_percentage 22690 1727204275.46109: checking to see if all hosts have failed and the running result is not ok 22690 1727204275.46110: done checking to see if all hosts have failed 22690 1727204275.46111: getting the remaining hosts for this loop 22690 1727204275.46112: done getting the remaining hosts for this loop 22690 1727204275.46117: getting the next task for host managed-node2 22690 1727204275.46124: done getting next task for host managed-node2 22690 1727204275.46129: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22690 1727204275.46131: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204275.46148: getting variables 22690 1727204275.46150: in VariableManager get_vars() 22690 1727204275.46321: Calling all_inventory to load vars for managed-node2 22690 1727204275.46324: Calling groups_inventory to load vars for managed-node2 22690 1727204275.46327: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204275.46340: Calling all_plugins_play to load vars for managed-node2 22690 1727204275.46345: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204275.46348: Calling groups_plugins_play to load vars for managed-node2 22690 1727204275.50302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204275.55470: done with get_vars() 22690 1727204275.55507: done getting variables 22690 1727204275.55729: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:55 -0400 (0:00:00.214) 0:00:42.842 ***** 22690 1727204275.55881: entering _queue_task() for managed-node2/service 22690 1727204275.57041: worker is 1 (out of 1 available) 22690 1727204275.57054: exiting _queue_task() for managed-node2/service 22690 1727204275.57070: done queuing things up, now waiting for results queue to drain 22690 1727204275.57072: waiting for pending results... 22690 1727204275.57735: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22690 1727204275.58041: in run() - task 127b8e07-fff9-78bb-bf56-000000000068 22690 1727204275.58046: variable 'ansible_search_path' from source: unknown 22690 1727204275.58050: variable 'ansible_search_path' from source: unknown 22690 1727204275.58078: calling self._execute() 22690 1727204275.58306: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204275.58320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204275.58366: variable 'omit' from source: magic vars 22690 1727204275.58804: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.58826: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204275.59025: variable 'network_provider' from source: set_fact 22690 1727204275.59129: variable 'network_state' from source: role '' defaults 22690 1727204275.59133: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22690 1727204275.59135: variable 'omit' from source: magic vars 22690 1727204275.59137: variable 'omit' from source: magic vars 22690 1727204275.59155: variable 'network_service_name' from source: role '' defaults 22690 1727204275.59246: variable 'network_service_name' from source: role '' defaults 22690 1727204275.59377: variable '__network_provider_setup' from source: role '' defaults 22690 1727204275.59388: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204275.59464: variable '__network_service_name_default_nm' from source: role '' defaults 22690 1727204275.59482: variable '__network_packages_default_nm' from source: role '' defaults 
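Unlike the preceding tasks, Enable and start NetworkManager proceeds: the log shows network_provider == "nm" or network_state != {} evaluating True and the role resolving network_service_name before assembling the service call. A sketch of a task consistent with those traces is below; the conditional and the network_service_name variable come from the log, while the started/enabled values are assumptions implied by the task title.

# Illustrative sketch only -- conditional and variable name from the log; state/enabled assumed.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm" or network_state != {}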
22690 1727204275.59551: variable '__network_packages_default_nm' from source: role '' defaults 22690 1727204275.59816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204275.62878: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204275.62976: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204275.63024: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204275.63072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204275.63145: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204275.63211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.63257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.63293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.63344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.63373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.63471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.63476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.63494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.63541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.63562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.63837: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22690 1727204275.63987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.64019: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.64127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.64131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.64133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.64216: variable 'ansible_python' from source: facts 22690 1727204275.64248: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22690 1727204275.64348: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204275.64529: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204275.64689: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.64711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.64741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.64787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.64946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.64951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204275.64962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204275.64967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.64970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204275.65371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204275.65375: variable 'network_connections' from 
source: play vars 22690 1727204275.65378: variable 'profile' from source: play vars 22690 1727204275.65380: variable 'profile' from source: play vars 22690 1727204275.65383: variable 'interface' from source: set_fact 22690 1727204275.65386: variable 'interface' from source: set_fact 22690 1727204275.65434: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204275.65685: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204275.65743: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204275.65788: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204275.65854: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204275.65901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204275.65937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204275.66101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204275.66141: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204275.66200: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204275.66991: variable 'network_connections' from source: play vars 22690 1727204275.66998: variable 'profile' from source: play vars 22690 1727204275.67208: variable 'profile' from source: play vars 22690 1727204275.67212: variable 'interface' from source: set_fact 22690 1727204275.67404: variable 'interface' from source: set_fact 22690 1727204275.67446: variable '__network_packages_default_wireless' from source: role '' defaults 22690 1727204275.67641: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204275.67999: variable 'network_connections' from source: play vars 22690 1727204275.68008: variable 'profile' from source: play vars 22690 1727204275.68089: variable 'profile' from source: play vars 22690 1727204275.68093: variable 'interface' from source: set_fact 22690 1727204275.68179: variable 'interface' from source: set_fact 22690 1727204275.68210: variable '__network_packages_default_team' from source: role '' defaults 22690 1727204275.68303: variable '__network_team_connections_defined' from source: role '' defaults 22690 1727204275.68647: variable 'network_connections' from source: play vars 22690 1727204275.68650: variable 'profile' from source: play vars 22690 1727204275.68772: variable 'profile' from source: play vars 22690 1727204275.68778: variable 'interface' from source: set_fact 22690 1727204275.68816: variable 'interface' from source: set_fact 22690 1727204275.68888: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204275.68954: variable '__network_service_name_default_initscripts' from source: role '' defaults 22690 1727204275.68991: 
variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204275.69030: variable '__network_packages_default_initscripts' from source: role '' defaults 22690 1727204275.69273: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22690 1727204275.70012: variable 'network_connections' from source: play vars 22690 1727204275.70016: variable 'profile' from source: play vars 22690 1727204275.70023: variable 'profile' from source: play vars 22690 1727204275.70028: variable 'interface' from source: set_fact 22690 1727204275.70117: variable 'interface' from source: set_fact 22690 1727204275.70125: variable 'ansible_distribution' from source: facts 22690 1727204275.70171: variable '__network_rh_distros' from source: role '' defaults 22690 1727204275.70175: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.70177: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22690 1727204275.70373: variable 'ansible_distribution' from source: facts 22690 1727204275.70376: variable '__network_rh_distros' from source: role '' defaults 22690 1727204275.70379: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.70381: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22690 1727204275.70571: variable 'ansible_distribution' from source: facts 22690 1727204275.70575: variable '__network_rh_distros' from source: role '' defaults 22690 1727204275.70578: variable 'ansible_distribution_major_version' from source: facts 22690 1727204275.70663: variable 'network_provider' from source: set_fact 22690 1727204275.70668: variable 'omit' from source: magic vars 22690 1727204275.70684: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204275.70774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204275.70777: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204275.70780: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204275.70783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204275.70808: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204275.70811: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204275.70814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204275.70954: Set connection var ansible_connection to ssh 22690 1727204275.70958: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204275.70960: Set connection var ansible_pipelining to False 22690 1727204275.70963: Set connection var ansible_shell_type to sh 22690 1727204275.70966: Set connection var ansible_shell_executable to /bin/sh 22690 1727204275.70970: Set connection var ansible_timeout to 10 22690 1727204275.71173: variable 'ansible_shell_executable' from source: unknown 22690 1727204275.71176: variable 'ansible_connection' from source: unknown 22690 1727204275.71179: variable 'ansible_module_compression' from source: unknown 22690 1727204275.71181: variable 'ansible_shell_type' from source: unknown 22690 1727204275.71184: variable 'ansible_shell_executable' from 
source: unknown 22690 1727204275.71186: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204275.71193: variable 'ansible_pipelining' from source: unknown 22690 1727204275.71195: variable 'ansible_timeout' from source: unknown 22690 1727204275.71199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204275.71203: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204275.71206: variable 'omit' from source: magic vars 22690 1727204275.71208: starting attempt loop 22690 1727204275.71211: running the handler 22690 1727204275.71261: variable 'ansible_facts' from source: unknown 22690 1727204275.72677: _low_level_execute_command(): starting 22690 1727204275.72683: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204275.73575: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204275.73579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204275.73582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204275.73781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204275.75527: stdout chunk (state=3): >>>/root <<< 22690 1727204275.75784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204275.75788: stdout chunk (state=3): >>><<< 22690 1727204275.75790: stderr chunk (state=3): >>><<< 22690 1727204275.75793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204275.75796: _low_level_execute_command(): starting 22690 1727204275.75800: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048 `" && echo ansible-tmp-1727204275.7575738-25591-230746408545048="` echo /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048 `" ) && sleep 0' 22690 1727204275.76513: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204275.76518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204275.76521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204275.76523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204275.76525: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204275.76527: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204275.76529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204275.76769: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204275.76986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204275.77175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204275.79223: stdout chunk (state=3): >>>ansible-tmp-1727204275.7575738-25591-230746408545048=/root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048 <<< 22690 1727204275.79473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204275.79478: stderr chunk (state=3): >>><<< 22690 1727204275.79481: stdout chunk (state=3): >>><<< 22690 1727204275.79484: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204275.7575738-25591-230746408545048=/root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204275.79486: variable 'ansible_module_compression' from source: unknown 22690 1727204275.79539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 22690 1727204275.79611: variable 'ansible_facts' from source: unknown 22690 1727204275.79830: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py 22690 1727204275.79979: Sending initial data 22690 1727204275.79982: Sent initial data (156 bytes) 22690 1727204275.80791: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204275.80880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204275.80911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204275.81005: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204275.82636: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204275.82729: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204275.82800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpvh74xr9g /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py <<< 22690 1727204275.82819: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py" <<< 22690 1727204275.82889: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpvh74xr9g" to remote "/root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py" <<< 22690 1727204275.84917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204275.85076: stderr chunk (state=3): >>><<< 22690 1727204275.85079: stdout chunk (state=3): >>><<< 22690 1727204275.85081: done transferring module to remote 22690 1727204275.85083: _low_level_execute_command(): starting 22690 1727204275.85085: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/ /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py && sleep 0' 22690 1727204275.85699: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204275.85787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204275.85843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204275.85871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204275.85910: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204275.85983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204275.87939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204275.87943: stdout chunk (state=3): >>><<< 22690 1727204275.87946: stderr chunk (state=3): >>><<< 22690 1727204275.87974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204275.87977: _low_level_execute_command(): starting 22690 1727204275.88060: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/AnsiballZ_systemd.py && sleep 0' 22690 1727204275.88731: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204275.88757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204275.88860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204275.88914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204275.88948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204275.88983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204275.89099: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204276.21175: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": 
"[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4485120", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3535568896", "CPUUsageNSec": "937981000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCO<<< 22690 1727204276.21195: stdout chunk (state=3): >>>RE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": 
"infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket 
network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22690 1727204276.22933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204276.22951: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204276.23054: stderr chunk (state=3): >>><<< 22690 1727204276.23075: stdout chunk (state=3): >>><<< 22690 1727204276.23100: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3396", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ExecMainStartTimestampMonotonic": "260891109", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3396", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5305", "MemoryCurrent": "4485120", "MemoryPeak": "6385664", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3535568896", "CPUUsageNSec": "937981000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": 
"infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4416", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14721", "LimitNPROCSoft": "14721", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14721", "LimitSIGPENDINGSoft": "14721", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target multi-user.target network.service shutdown.target", "After": "systemd-journald.socket network-pre.target dbus-broker.service dbus.socket sysinit.target basic.target cloud-init-local.service system.slice", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:55 EDT", "StateChangeTimestampMonotonic": "382184288", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveExitTimestampMonotonic": "260891359", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveEnterTimestampMonotonic": "260977925", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ActiveExitTimestampMonotonic": "260855945", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:54 EDT", "InactiveEnterTimestampMonotonic": "260882383", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:54 EDT", "ConditionTimestampMonotonic": "260884363", "AssertTimestamp": "Tue 2024-09-24 14:51:54 EDT", "AssertTimestampMonotonic": "260884375", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "16e01f988f204339aa5cb89910a771d1", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204276.23326: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204276.23353: _low_level_execute_command(): starting 22690 1727204276.23364: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204275.7575738-25591-230746408545048/ > /dev/null 2>&1 && sleep 0' 22690 1727204276.24022: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204276.24039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204276.24052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204276.24072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204276.24087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204276.24097: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204276.24108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.24124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204276.24134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204276.24145: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204276.24155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204276.24172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204276.24193: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204276.24203: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204276.24212: stderr chunk (state=3): >>>debug2: match found <<< 22690 1727204276.24224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.24302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204276.24318: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204276.24342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204276.24438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204276.26449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204276.26523: stdout chunk (state=3): >>><<< 22690 1727204276.26539: stderr chunk (state=3): >>><<< 22690 1727204276.26561: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204276.26577: handler run complete 22690 1727204276.26775: attempt loop complete, returning result 22690 1727204276.26779: _execute() done 22690 1727204276.26782: dumping result to json 22690 1727204276.26784: done dumping result, returning 22690 1727204276.26786: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [127b8e07-fff9-78bb-bf56-000000000068] 22690 1727204276.26789: sending task result for task 127b8e07-fff9-78bb-bf56-000000000068 22690 1727204276.27227: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000068 22690 1727204276.27232: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 1727204276.27288: no more pending results, returning what we have 22690 1727204276.27292: results queue empty 22690 1727204276.27293: checking for any_errors_fatal 22690 1727204276.27303: done checking for any_errors_fatal 22690 1727204276.27304: checking for max_fail_percentage 22690 1727204276.27305: done checking for max_fail_percentage 22690 1727204276.27306: checking to see if all hosts have failed and the running result is not ok 22690 
1727204276.27307: done checking to see if all hosts have failed 22690 1727204276.27308: getting the remaining hosts for this loop 22690 1727204276.27310: done getting the remaining hosts for this loop 22690 1727204276.27317: getting the next task for host managed-node2 22690 1727204276.27323: done getting next task for host managed-node2 22690 1727204276.27327: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22690 1727204276.27329: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204276.27340: getting variables 22690 1727204276.27342: in VariableManager get_vars() 22690 1727204276.27379: Calling all_inventory to load vars for managed-node2 22690 1727204276.27382: Calling groups_inventory to load vars for managed-node2 22690 1727204276.27384: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204276.27395: Calling all_plugins_play to load vars for managed-node2 22690 1727204276.27398: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204276.27401: Calling groups_plugins_play to load vars for managed-node2 22690 1727204276.29226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204276.31659: done with get_vars() 22690 1727204276.31707: done getting variables 22690 1727204276.31783: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.759) 0:00:43.601 ***** 22690 1727204276.31831: entering _queue_task() for managed-node2/service 22690 1727204276.32235: worker is 1 (out of 1 available) 22690 1727204276.32476: exiting _queue_task() for managed-node2/service 22690 1727204276.32489: done queuing things up, now waiting for results queue to drain 22690 1727204276.32491: waiting for pending results... 
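For reference, the task that just completed above ran the action plugin 'service', which resolved to ansible.legacy.systemd with module_args name=NetworkManager, state=started, enabled=true, scope=system, and the result was censored because no_log was set. The following is a minimal standalone sketch that reproduces that same module invocation outside the role; it is reconstructed only from the module_args visible in this log, not copied from the role's own tasks/main.yml, and the hosts pattern and play name are placeholders.

# Sketch only: reproduces the recorded systemd call; hosts/play name are placeholders.
- name: Standalone equivalent of the recorded NetworkManager task (sketch)
  hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager      # matches module_args "name" in the log
        state: started            # matches module_args "state"
        enabled: true             # matches module_args "enabled"
        scope: system             # matches module_args "scope"
      no_log: true                # matches '_ansible_no_log': True in the recorded invocation

Running such a task against an already enabled and running NetworkManager.service would report ok with changed=false, which is consistent with the censored result shown above.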
22690 1727204276.32603: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22690 1727204276.32738: in run() - task 127b8e07-fff9-78bb-bf56-000000000069 22690 1727204276.32756: variable 'ansible_search_path' from source: unknown 22690 1727204276.32761: variable 'ansible_search_path' from source: unknown 22690 1727204276.32806: calling self._execute() 22690 1727204276.32968: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204276.32973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204276.32977: variable 'omit' from source: magic vars 22690 1727204276.33429: variable 'ansible_distribution_major_version' from source: facts 22690 1727204276.33442: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204276.33584: variable 'network_provider' from source: set_fact 22690 1727204276.33616: Evaluated conditional (network_provider == "nm"): True 22690 1727204276.33707: variable '__network_wpa_supplicant_required' from source: role '' defaults 22690 1727204276.33814: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22690 1727204276.34171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204276.36983: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204276.37111: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204276.37115: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204276.37148: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204276.37370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204276.37374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204276.37377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204276.37380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204276.37470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204276.37474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204276.37477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204276.37487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22690 1727204276.37516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204276.37562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204276.37577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204276.37629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22690 1727204276.37652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204276.37679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204276.37719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204276.37742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204276.37970: variable 'network_connections' from source: play vars 22690 1727204276.37974: variable 'profile' from source: play vars 22690 1727204276.38000: variable 'profile' from source: play vars 22690 1727204276.38004: variable 'interface' from source: set_fact 22690 1727204276.38081: variable 'interface' from source: set_fact 22690 1727204276.38169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22690 1727204276.38380: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22690 1727204276.38432: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22690 1727204276.38462: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22690 1727204276.38494: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22690 1727204276.38543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22690 1727204276.38563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22690 1727204276.38588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204276.38623: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22690 1727204276.38673: variable '__network_wireless_connections_defined' from source: role '' defaults 22690 1727204276.38985: variable 'network_connections' from source: play vars 22690 1727204276.38992: variable 'profile' from source: play vars 22690 1727204276.39073: variable 'profile' from source: play vars 22690 1727204276.39076: variable 'interface' from source: set_fact 22690 1727204276.39142: variable 'interface' from source: set_fact 22690 1727204276.39183: Evaluated conditional (__network_wpa_supplicant_required): False 22690 1727204276.39187: when evaluation is False, skipping this task 22690 1727204276.39190: _execute() done 22690 1727204276.39202: dumping result to json 22690 1727204276.39205: done dumping result, returning 22690 1727204276.39208: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [127b8e07-fff9-78bb-bf56-000000000069] 22690 1727204276.39211: sending task result for task 127b8e07-fff9-78bb-bf56-000000000069 22690 1727204276.39316: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000069 22690 1727204276.39320: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22690 1727204276.39583: no more pending results, returning what we have 22690 1727204276.39587: results queue empty 22690 1727204276.39588: checking for any_errors_fatal 22690 1727204276.39611: done checking for any_errors_fatal 22690 1727204276.39612: checking for max_fail_percentage 22690 1727204276.39617: done checking for max_fail_percentage 22690 1727204276.39618: checking to see if all hosts have failed and the running result is not ok 22690 1727204276.39619: done checking to see if all hosts have failed 22690 1727204276.39620: getting the remaining hosts for this loop 22690 1727204276.39622: done getting the remaining hosts for this loop 22690 1727204276.39626: getting the next task for host managed-node2 22690 1727204276.39633: done getting next task for host managed-node2 22690 1727204276.39637: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22690 1727204276.39639: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204276.39653: getting variables 22690 1727204276.39655: in VariableManager get_vars() 22690 1727204276.39700: Calling all_inventory to load vars for managed-node2 22690 1727204276.39703: Calling groups_inventory to load vars for managed-node2 22690 1727204276.39706: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204276.39721: Calling all_plugins_play to load vars for managed-node2 22690 1727204276.39725: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204276.39728: Calling groups_plugins_play to load vars for managed-node2 22690 1727204276.41870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204276.44103: done with get_vars() 22690 1727204276.44152: done getting variables 22690 1727204276.44221: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.124) 0:00:43.725 ***** 22690 1727204276.44259: entering _queue_task() for managed-node2/service 22690 1727204276.44778: worker is 1 (out of 1 available) 22690 1727204276.44792: exiting _queue_task() for managed-node2/service 22690 1727204276.44806: done queuing things up, now waiting for results queue to drain 22690 1727204276.44808: waiting for pending results... 22690 1727204276.45258: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 22690 1727204276.45264: in run() - task 127b8e07-fff9-78bb-bf56-00000000006a 22690 1727204276.45270: variable 'ansible_search_path' from source: unknown 22690 1727204276.45273: variable 'ansible_search_path' from source: unknown 22690 1727204276.45276: calling self._execute() 22690 1727204276.45394: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204276.45398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204276.45429: variable 'omit' from source: magic vars 22690 1727204276.45892: variable 'ansible_distribution_major_version' from source: facts 22690 1727204276.45896: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204276.46088: variable 'network_provider' from source: set_fact 22690 1727204276.46091: Evaluated conditional (network_provider == "initscripts"): False 22690 1727204276.46095: when evaluation is False, skipping this task 22690 1727204276.46099: _execute() done 22690 1727204276.46106: dumping result to json 22690 1727204276.46110: done dumping result, returning 22690 1727204276.46120: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [127b8e07-fff9-78bb-bf56-00000000006a] 22690 1727204276.46123: sending task result for task 127b8e07-fff9-78bb-bf56-00000000006a 22690 1727204276.46196: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000006a 22690 1727204276.46199: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22690 
1727204276.46276: no more pending results, returning what we have 22690 1727204276.46282: results queue empty 22690 1727204276.46284: checking for any_errors_fatal 22690 1727204276.46373: done checking for any_errors_fatal 22690 1727204276.46374: checking for max_fail_percentage 22690 1727204276.46377: done checking for max_fail_percentage 22690 1727204276.46378: checking to see if all hosts have failed and the running result is not ok 22690 1727204276.46379: done checking to see if all hosts have failed 22690 1727204276.46380: getting the remaining hosts for this loop 22690 1727204276.46381: done getting the remaining hosts for this loop 22690 1727204276.46386: getting the next task for host managed-node2 22690 1727204276.46394: done getting next task for host managed-node2 22690 1727204276.46399: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22690 1727204276.46404: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204276.46427: getting variables 22690 1727204276.46429: in VariableManager get_vars() 22690 1727204276.46592: Calling all_inventory to load vars for managed-node2 22690 1727204276.46595: Calling groups_inventory to load vars for managed-node2 22690 1727204276.46598: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204276.46608: Calling all_plugins_play to load vars for managed-node2 22690 1727204276.46611: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204276.46618: Calling groups_plugins_play to load vars for managed-node2 22690 1727204276.50412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204276.55234: done with get_vars() 22690 1727204276.55388: done getting variables 22690 1727204276.55462: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.112) 0:00:43.838 ***** 22690 1727204276.55503: entering _queue_task() for managed-node2/copy 22690 1727204276.56378: worker is 1 (out of 1 available) 22690 1727204276.56393: exiting _queue_task() for managed-node2/copy 22690 1727204276.56406: done queuing things up, now waiting for results queue to drain 22690 1727204276.56407: waiting for pending results... 
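Both of the preceding tasks follow the same pattern: anything guarded by network_provider == "initscripts" is skipped because the provider resolved to "nm" earlier in the run. The next task, "Configure networking connection profiles", is the first in this block that actually reaches the managed node, and the trace below walks through Ansible's usual remote-execution sequence: probe the remote home directory, create a unique temp directory, transfer the AnsiballZ_network_connections.py payload over sftp, mark it executable, run it with /usr/bin/python3.12, and remove the temp directory afterwards. A rough Python sketch of that sequence, assuming plain ssh/scp from a control host; the host string is taken from the trace, while the temp-dir name and the local payload path are placeholders, and the real implementation (connection multiplexing, sftp, payload assembly, pipelining) is far more involved:

# Simplified sketch of the remote-execution steps visible in the trace below.
# This is an illustration, not Ansible's implementation; it assumes the payload
# file exists locally and that ssh/scp to HOST work non-interactively.
import shlex
import subprocess

HOST = "root@10.31.47.73"                      # address taken from the trace
PAYLOAD = "AnsiballZ_network_connections.py"   # payload name taken from the trace

def ssh(cmd: str) -> str:
    """Run a shell command on the managed node and return its stdout."""
    return subprocess.run(
        ["ssh", HOST, cmd], capture_output=True, text=True, check=True
    ).stdout.strip()

print(ssh("echo ~ && sleep 0"))                # -> /root in this run
tmpdir = ssh(
    'umask 77 && mkdir -p ~/.ansible/tmp && '
    'd=~/.ansible/tmp/ansible-tmp-example && mkdir "$d" && echo "$d"'   # placeholder dir name
)
subprocess.run(["scp", PAYLOAD, f"{HOST}:{tmpdir}/"], check=True)       # trace uses sftp
ssh(f"chmod u+x {shlex.quote(tmpdir)} {shlex.quote(tmpdir + '/' + PAYLOAD)}")
print(ssh(f"/usr/bin/python3.12 {shlex.quote(tmpdir + '/' + PAYLOAD)}"))
ssh(f"rm -rf {shlex.quote(tmpdir)} > /dev/null 2>&1")

The actual commands in the trace reuse a persistent ControlMaster socket ('/root/.ansible/cp/7ef5e35320'), which is why each step only shows mux_client_request_session rather than a fresh SSH handshake.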
22690 1727204276.57085: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22690 1727204276.57091: in run() - task 127b8e07-fff9-78bb-bf56-00000000006b 22690 1727204276.57095: variable 'ansible_search_path' from source: unknown 22690 1727204276.57099: variable 'ansible_search_path' from source: unknown 22690 1727204276.57104: calling self._execute() 22690 1727204276.57107: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204276.57111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204276.57115: variable 'omit' from source: magic vars 22690 1727204276.57414: variable 'ansible_distribution_major_version' from source: facts 22690 1727204276.57433: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204276.57579: variable 'network_provider' from source: set_fact 22690 1727204276.57585: Evaluated conditional (network_provider == "initscripts"): False 22690 1727204276.57589: when evaluation is False, skipping this task 22690 1727204276.57592: _execute() done 22690 1727204276.57597: dumping result to json 22690 1727204276.57600: done dumping result, returning 22690 1727204276.57611: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [127b8e07-fff9-78bb-bf56-00000000006b] 22690 1727204276.57617: sending task result for task 127b8e07-fff9-78bb-bf56-00000000006b 22690 1727204276.57731: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000006b 22690 1727204276.57736: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22690 1727204276.57791: no more pending results, returning what we have 22690 1727204276.57797: results queue empty 22690 1727204276.57798: checking for any_errors_fatal 22690 1727204276.57805: done checking for any_errors_fatal 22690 1727204276.57806: checking for max_fail_percentage 22690 1727204276.57808: done checking for max_fail_percentage 22690 1727204276.57809: checking to see if all hosts have failed and the running result is not ok 22690 1727204276.57809: done checking to see if all hosts have failed 22690 1727204276.57810: getting the remaining hosts for this loop 22690 1727204276.57812: done getting the remaining hosts for this loop 22690 1727204276.57819: getting the next task for host managed-node2 22690 1727204276.57825: done getting next task for host managed-node2 22690 1727204276.57829: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22690 1727204276.57831: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204276.57849: getting variables 22690 1727204276.57851: in VariableManager get_vars() 22690 1727204276.57901: Calling all_inventory to load vars for managed-node2 22690 1727204276.57904: Calling groups_inventory to load vars for managed-node2 22690 1727204276.57906: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204276.57925: Calling all_plugins_play to load vars for managed-node2 22690 1727204276.57929: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204276.57933: Calling groups_plugins_play to load vars for managed-node2 22690 1727204276.60064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204276.62373: done with get_vars() 22690 1727204276.62422: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.070) 0:00:43.908 ***** 22690 1727204276.62529: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22690 1727204276.62952: worker is 1 (out of 1 available) 22690 1727204276.62968: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22690 1727204276.63095: done queuing things up, now waiting for results queue to drain 22690 1727204276.63096: waiting for pending results... 22690 1727204276.63488: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22690 1727204276.63495: in run() - task 127b8e07-fff9-78bb-bf56-00000000006c 22690 1727204276.63498: variable 'ansible_search_path' from source: unknown 22690 1727204276.63502: variable 'ansible_search_path' from source: unknown 22690 1727204276.63506: calling self._execute() 22690 1727204276.63610: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204276.63615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204276.63886: variable 'omit' from source: magic vars 22690 1727204276.64076: variable 'ansible_distribution_major_version' from source: facts 22690 1727204276.64090: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204276.64098: variable 'omit' from source: magic vars 22690 1727204276.64153: variable 'omit' from source: magic vars 22690 1727204276.64326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22690 1727204276.67367: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22690 1727204276.67447: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22690 1727204276.67488: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22690 1727204276.67535: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22690 1727204276.67562: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22690 1727204276.67656: variable 'network_provider' from source: set_fact 22690 1727204276.67809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 22690 1727204276.67849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22690 1727204276.67880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22690 1727204276.67927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22690 1727204276.67950: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22690 1727204276.68035: variable 'omit' from source: magic vars 22690 1727204276.68177: variable 'omit' from source: magic vars 22690 1727204276.68293: variable 'network_connections' from source: play vars 22690 1727204276.68321: variable 'profile' from source: play vars 22690 1727204276.68424: variable 'profile' from source: play vars 22690 1727204276.68428: variable 'interface' from source: set_fact 22690 1727204276.68455: variable 'interface' from source: set_fact 22690 1727204276.68625: variable 'omit' from source: magic vars 22690 1727204276.68634: variable '__lsr_ansible_managed' from source: task vars 22690 1727204276.68706: variable '__lsr_ansible_managed' from source: task vars 22690 1727204276.69472: Loaded config def from plugin (lookup/template) 22690 1727204276.69476: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22690 1727204276.69479: File lookup term: get_ansible_managed.j2 22690 1727204276.69482: variable 'ansible_search_path' from source: unknown 22690 1727204276.69485: evaluation_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22690 1727204276.69488: search_path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22690 1727204276.69491: variable 'ansible_search_path' from source: unknown 22690 1727204276.77113: variable 'ansible_managed' from source: unknown 22690 1727204276.77298: variable 'omit' from source: magic vars 22690 1727204276.77342: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204276.77371: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204276.77392: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204276.77416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204276.77437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204276.77470: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204276.77474: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204276.77477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204276.77591: Set connection var ansible_connection to ssh 22690 1727204276.77604: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204276.77613: Set connection var ansible_pipelining to False 22690 1727204276.77616: Set connection var ansible_shell_type to sh 22690 1727204276.77628: Set connection var ansible_shell_executable to /bin/sh 22690 1727204276.77642: Set connection var ansible_timeout to 10 22690 1727204276.77671: variable 'ansible_shell_executable' from source: unknown 22690 1727204276.77674: variable 'ansible_connection' from source: unknown 22690 1727204276.77677: variable 'ansible_module_compression' from source: unknown 22690 1727204276.77680: variable 'ansible_shell_type' from source: unknown 22690 1727204276.77683: variable 'ansible_shell_executable' from source: unknown 22690 1727204276.77685: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204276.77690: variable 'ansible_pipelining' from source: unknown 22690 1727204276.77693: variable 'ansible_timeout' from source: unknown 22690 1727204276.77733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204276.77869: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204276.77882: variable 'omit' from source: magic vars 22690 1727204276.77885: starting attempt loop 22690 1727204276.77888: running the handler 22690 1727204276.77951: _low_level_execute_command(): starting 22690 1727204276.77954: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204276.78655: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204276.78663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.78680: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204276.78686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204276.78700: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found <<< 22690 1727204276.78708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.78783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204276.78820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204276.78924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204276.80693: stdout chunk (state=3): >>>/root <<< 22690 1727204276.80905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204276.80909: stdout chunk (state=3): >>><<< 22690 1727204276.80912: stderr chunk (state=3): >>><<< 22690 1727204276.80972: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204276.80976: _low_level_execute_command(): starting 22690 1727204276.80980: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451 `" && echo ansible-tmp-1727204276.8093944-25634-105944322596451="` echo /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451 `" ) && sleep 0' 22690 1727204276.81647: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204276.81656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204276.81745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204276.81748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204276.81751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204276.81753: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204276.81755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 
1727204276.81757: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204276.81815: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204276.81858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204276.81928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204276.83918: stdout chunk (state=3): >>>ansible-tmp-1727204276.8093944-25634-105944322596451=/root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451 <<< 22690 1727204276.84138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204276.84143: stdout chunk (state=3): >>><<< 22690 1727204276.84145: stderr chunk (state=3): >>><<< 22690 1727204276.84273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204276.8093944-25634-105944322596451=/root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204276.84279: variable 'ansible_module_compression' from source: unknown 22690 1727204276.84290: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 22690 1727204276.84327: variable 'ansible_facts' from source: unknown 22690 1727204276.84453: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py 22690 1727204276.84632: Sending initial data 22690 1727204276.84642: Sent initial data (168 bytes) 22690 1727204276.85334: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204276.85390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204276.85485: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204276.85513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204276.85632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204276.87260: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204276.87343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204276.87434: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpnet_kq9_ /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py <<< 22690 1727204276.87438: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py" <<< 22690 1727204276.87499: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpnet_kq9_" to remote "/root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py" <<< 22690 1727204276.88996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204276.89000: stdout chunk (state=3): >>><<< 22690 1727204276.89002: stderr chunk (state=3): >>><<< 22690 1727204276.89004: done transferring module to remote 22690 1727204276.89006: _low_level_execute_command(): starting 22690 1727204276.89008: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/ /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py && sleep 0' 22690 1727204276.89691: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204276.89762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.89820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204276.89844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204276.89867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204276.89978: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204276.91901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204276.91906: stdout chunk (state=3): >>><<< 22690 1727204276.91912: stderr chunk (state=3): >>><<< 22690 1727204276.91935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204276.91939: _low_level_execute_command(): starting 22690 1727204276.91941: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/AnsiballZ_network_connections.py && sleep 0' 22690 1727204276.92648: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204276.92652: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204276.92655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204276.92658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204276.92673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204276.92679: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204276.92689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.92754: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204276.92757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204276.92760: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204276.92779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204276.92821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204276.92841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204276.92873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204276.92983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204277.21913: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tp_9tkn0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tp_9tkn0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/5a4ab182-4b5b-42b9-9199-87bcb8efcb93: error=unknown <<< 22690 1727204277.22044: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22690 1727204277.23942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204277.23948: stdout chunk (state=3): >>><<< 22690 1727204277.23958: stderr chunk (state=3): >>><<< 22690 1727204277.24095: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tp_9tkn0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_tp_9tkn0/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/5a4ab182-4b5b-42b9-9199-87bcb8efcb93: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 
10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204277.24099: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204277.24102: _low_level_execute_command(): starting 22690 1727204277.24105: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204276.8093944-25634-105944322596451/ > /dev/null 2>&1 && sleep 0' 22690 1727204277.25380: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204277.25384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204277.25387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204277.25389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204277.25485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204277.25489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204277.25572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204277.27644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204277.27648: stdout chunk (state=3): >>><<< 22690 1727204277.27651: stderr chunk (state=3): >>><<< 22690 1727204277.27677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204277.27785: handler run complete 22690 1727204277.27791: attempt loop complete, returning result 22690 1727204277.27794: _execute() done 22690 1727204277.27797: dumping result to json 22690 1727204277.27799: done dumping result, returning 22690 1727204277.27801: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [127b8e07-fff9-78bb-bf56-00000000006c] 22690 1727204277.27803: sending task result for task 127b8e07-fff9-78bb-bf56-00000000006c 22690 1727204277.28256: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000006c 22690 1727204277.28260: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 22690 1727204277.28363: no more pending results, returning what we have 22690 1727204277.28368: results queue empty 22690 1727204277.28369: checking for any_errors_fatal 22690 1727204277.28377: done checking for any_errors_fatal 22690 1727204277.28378: checking for max_fail_percentage 22690 1727204277.28380: done checking for max_fail_percentage 22690 1727204277.28380: checking to see if all hosts have failed and the running result is not ok 22690 1727204277.28381: done checking to see if all hosts have failed 22690 1727204277.28382: getting the remaining hosts for this loop 22690 1727204277.28384: done getting the remaining hosts for this loop 22690 1727204277.28387: getting the next task for host managed-node2 22690 1727204277.28394: done getting next task for host managed-node2 22690 1727204277.28399: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22690 1727204277.28401: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204277.28412: getting variables 22690 1727204277.28414: in VariableManager get_vars() 22690 1727204277.28454: Calling all_inventory to load vars for managed-node2 22690 1727204277.28457: Calling groups_inventory to load vars for managed-node2 22690 1727204277.28459: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204277.28500: Calling all_plugins_play to load vars for managed-node2 22690 1727204277.28505: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204277.28510: Calling groups_plugins_play to load vars for managed-node2 22690 1727204277.31669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204277.34759: done with get_vars() 22690 1727204277.34803: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.723) 0:00:44.632 ***** 22690 1727204277.34909: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22690 1727204277.35423: worker is 1 (out of 1 available) 22690 1727204277.35437: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22690 1727204277.35450: done queuing things up, now waiting for results queue to drain 22690 1727204277.35451: waiting for pending results... 22690 1727204277.35751: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 22690 1727204277.35815: in run() - task 127b8e07-fff9-78bb-bf56-00000000006d 22690 1727204277.35841: variable 'ansible_search_path' from source: unknown 22690 1727204277.35854: variable 'ansible_search_path' from source: unknown 22690 1727204277.35910: calling self._execute() 22690 1727204277.36045: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.36073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.36171: variable 'omit' from source: magic vars 22690 1727204277.36641: variable 'ansible_distribution_major_version' from source: facts 22690 1727204277.36661: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204277.36822: variable 'network_state' from source: role '' defaults 22690 1727204277.36847: Evaluated conditional (network_state != {}): False 22690 1727204277.36859: when evaluation is False, skipping this task 22690 1727204277.36871: _execute() done 22690 1727204277.36880: dumping result to json 22690 1727204277.36889: done dumping result, returning 22690 1727204277.36901: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [127b8e07-fff9-78bb-bf56-00000000006d] 22690 1727204277.36912: sending task result for task 127b8e07-fff9-78bb-bf56-00000000006d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22690 1727204277.37195: no more pending results, returning what we have 22690 1727204277.37202: results queue empty 22690 1727204277.37203: checking for any_errors_fatal 22690 1727204277.37219: done checking for any_errors_fatal 22690 1727204277.37220: checking for max_fail_percentage 22690 1727204277.37222: done checking for max_fail_percentage 22690 1727204277.37224: checking to see if all hosts have failed and the running result is 
not ok 22690 1727204277.37225: done checking to see if all hosts have failed 22690 1727204277.37225: getting the remaining hosts for this loop 22690 1727204277.37227: done getting the remaining hosts for this loop 22690 1727204277.37232: getting the next task for host managed-node2 22690 1727204277.37240: done getting next task for host managed-node2 22690 1727204277.37245: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22690 1727204277.37248: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204277.37384: getting variables 22690 1727204277.37387: in VariableManager get_vars() 22690 1727204277.37432: Calling all_inventory to load vars for managed-node2 22690 1727204277.37435: Calling groups_inventory to load vars for managed-node2 22690 1727204277.37437: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204277.37611: Calling all_plugins_play to load vars for managed-node2 22690 1727204277.37616: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204277.37620: Calling groups_plugins_play to load vars for managed-node2 22690 1727204277.38286: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000006d 22690 1727204277.38292: WORKER PROCESS EXITING 22690 1727204277.40164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204277.42703: done with get_vars() 22690 1727204277.42746: done getting variables 22690 1727204277.42823: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.079) 0:00:44.711 ***** 22690 1727204277.42859: entering _queue_task() for managed-node2/debug 22690 1727204277.43276: worker is 1 (out of 1 available) 22690 1727204277.43291: exiting _queue_task() for managed-node2/debug 22690 1727204277.43311: done queuing things up, now waiting for results queue to drain 22690 1727204277.43313: waiting for pending results... 
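Two reporting tasks close out this part of the block. Note that although the network_connections module's stdout above included an LsrNetworkNmError traceback ("Connection volatilize aborted on lsr27/..."), the module still returned a well-formed JSON result with changed=true and a stderr of just "\n", so the play continues. The debug task below therefore prints __network_connections_result.stderr_lines as a single empty string; Ansible appears to derive the *_lines fields by splitting the corresponding string on newlines (splitlines-style), which the following one-liner makes concrete:

stderr = "\n"                       # value reported by the module result above
stderr_lines = stderr.splitlines()  # how the *_lines fields appear to be derived
print(stderr_lines)                 # ['']  -> rendered as "stderr_lines": [""] below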
22690 1727204277.43601: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22690 1727204277.43718: in run() - task 127b8e07-fff9-78bb-bf56-00000000006e 22690 1727204277.43740: variable 'ansible_search_path' from source: unknown 22690 1727204277.43744: variable 'ansible_search_path' from source: unknown 22690 1727204277.43787: calling self._execute() 22690 1727204277.43898: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.43904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.43915: variable 'omit' from source: magic vars 22690 1727204277.44362: variable 'ansible_distribution_major_version' from source: facts 22690 1727204277.44381: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204277.44394: variable 'omit' from source: magic vars 22690 1727204277.44445: variable 'omit' from source: magic vars 22690 1727204277.44493: variable 'omit' from source: magic vars 22690 1727204277.44542: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204277.44589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204277.44612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204277.44635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204277.44648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204277.44686: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204277.44694: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.44697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.44815: Set connection var ansible_connection to ssh 22690 1727204277.44829: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204277.44838: Set connection var ansible_pipelining to False 22690 1727204277.44841: Set connection var ansible_shell_type to sh 22690 1727204277.44847: Set connection var ansible_shell_executable to /bin/sh 22690 1727204277.44855: Set connection var ansible_timeout to 10 22690 1727204277.44885: variable 'ansible_shell_executable' from source: unknown 22690 1727204277.44889: variable 'ansible_connection' from source: unknown 22690 1727204277.44892: variable 'ansible_module_compression' from source: unknown 22690 1727204277.44895: variable 'ansible_shell_type' from source: unknown 22690 1727204277.44898: variable 'ansible_shell_executable' from source: unknown 22690 1727204277.44900: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.44902: variable 'ansible_pipelining' from source: unknown 22690 1727204277.44913: variable 'ansible_timeout' from source: unknown 22690 1727204277.44921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.45091: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 
1727204277.45157: variable 'omit' from source: magic vars 22690 1727204277.45161: starting attempt loop 22690 1727204277.45163: running the handler 22690 1727204277.45278: variable '__network_connections_result' from source: set_fact 22690 1727204277.45342: handler run complete 22690 1727204277.45377: attempt loop complete, returning result 22690 1727204277.45383: _execute() done 22690 1727204277.45387: dumping result to json 22690 1727204277.45389: done dumping result, returning 22690 1727204277.45392: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [127b8e07-fff9-78bb-bf56-00000000006e] 22690 1727204277.45394: sending task result for task 127b8e07-fff9-78bb-bf56-00000000006e ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 22690 1727204277.45636: no more pending results, returning what we have 22690 1727204277.45640: results queue empty 22690 1727204277.45641: checking for any_errors_fatal 22690 1727204277.45646: done checking for any_errors_fatal 22690 1727204277.45647: checking for max_fail_percentage 22690 1727204277.45648: done checking for max_fail_percentage 22690 1727204277.45649: checking to see if all hosts have failed and the running result is not ok 22690 1727204277.45650: done checking to see if all hosts have failed 22690 1727204277.45651: getting the remaining hosts for this loop 22690 1727204277.45653: done getting the remaining hosts for this loop 22690 1727204277.45656: getting the next task for host managed-node2 22690 1727204277.45662: done getting next task for host managed-node2 22690 1727204277.45667: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22690 1727204277.45670: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204277.45794: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000006e 22690 1727204277.45797: WORKER PROCESS EXITING 22690 1727204277.45802: getting variables 22690 1727204277.45804: in VariableManager get_vars() 22690 1727204277.45841: Calling all_inventory to load vars for managed-node2 22690 1727204277.45844: Calling groups_inventory to load vars for managed-node2 22690 1727204277.45846: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204277.45856: Calling all_plugins_play to load vars for managed-node2 22690 1727204277.45859: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204277.45862: Calling groups_plugins_play to load vars for managed-node2 22690 1727204277.47567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204277.49767: done with get_vars() 22690 1727204277.49815: done getting variables 22690 1727204277.49890: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.070) 0:00:44.782 ***** 22690 1727204277.49932: entering _queue_task() for managed-node2/debug 22690 1727204277.50568: worker is 1 (out of 1 available) 22690 1727204277.50581: exiting _queue_task() for managed-node2/debug 22690 1727204277.50592: done queuing things up, now waiting for results queue to drain 22690 1727204277.50594: waiting for pending results... 
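Annotation: the debug task at main.yml:177 has just reported an empty stderr ("stderr_lines": [""]) from the earlier connection-provisioning step, and the task queued next (main.yml:181) dumps the whole registered result. Reconstructed from the variable names printed in the output, the pair is presumably nothing more than two debug calls; this is a sketch, not the shipped role file, and any extra when clauses the role may carry are left out:

- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result

Because __network_connections_result was stored with set_fact earlier in the play (visible as "from source: set_fact" in the trace), both tasks resolve on the controller without contacting the managed node.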
22690 1727204277.50714: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22690 1727204277.50860: in run() - task 127b8e07-fff9-78bb-bf56-00000000006f 22690 1727204277.50931: variable 'ansible_search_path' from source: unknown 22690 1727204277.50938: variable 'ansible_search_path' from source: unknown 22690 1727204277.51040: calling self._execute() 22690 1727204277.51076: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.51089: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.51106: variable 'omit' from source: magic vars 22690 1727204277.51546: variable 'ansible_distribution_major_version' from source: facts 22690 1727204277.51570: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204277.51591: variable 'omit' from source: magic vars 22690 1727204277.51649: variable 'omit' from source: magic vars 22690 1727204277.51703: variable 'omit' from source: magic vars 22690 1727204277.51754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204277.51819: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204277.51832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204277.51854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204277.51908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204277.51911: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204277.51916: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.51927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.52042: Set connection var ansible_connection to ssh 22690 1727204277.52059: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204277.52073: Set connection var ansible_pipelining to False 22690 1727204277.52080: Set connection var ansible_shell_type to sh 22690 1727204277.52089: Set connection var ansible_shell_executable to /bin/sh 22690 1727204277.52124: Set connection var ansible_timeout to 10 22690 1727204277.52135: variable 'ansible_shell_executable' from source: unknown 22690 1727204277.52147: variable 'ansible_connection' from source: unknown 22690 1727204277.52154: variable 'ansible_module_compression' from source: unknown 22690 1727204277.52160: variable 'ansible_shell_type' from source: unknown 22690 1727204277.52233: variable 'ansible_shell_executable' from source: unknown 22690 1727204277.52237: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.52239: variable 'ansible_pipelining' from source: unknown 22690 1727204277.52242: variable 'ansible_timeout' from source: unknown 22690 1727204277.52244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.52358: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 
1727204277.52379: variable 'omit' from source: magic vars 22690 1727204277.52389: starting attempt loop 22690 1727204277.52394: running the handler 22690 1727204277.52450: variable '__network_connections_result' from source: set_fact 22690 1727204277.52549: variable '__network_connections_result' from source: set_fact 22690 1727204277.52690: handler run complete 22690 1727204277.52775: attempt loop complete, returning result 22690 1727204277.52778: _execute() done 22690 1727204277.52781: dumping result to json 22690 1727204277.52783: done dumping result, returning 22690 1727204277.52786: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [127b8e07-fff9-78bb-bf56-00000000006f] 22690 1727204277.52788: sending task result for task 127b8e07-fff9-78bb-bf56-00000000006f 22690 1727204277.52982: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000006f ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 22690 1727204277.53081: no more pending results, returning what we have 22690 1727204277.53085: results queue empty 22690 1727204277.53087: checking for any_errors_fatal 22690 1727204277.53096: done checking for any_errors_fatal 22690 1727204277.53097: checking for max_fail_percentage 22690 1727204277.53098: done checking for max_fail_percentage 22690 1727204277.53099: checking to see if all hosts have failed and the running result is not ok 22690 1727204277.53100: done checking to see if all hosts have failed 22690 1727204277.53101: getting the remaining hosts for this loop 22690 1727204277.53103: done getting the remaining hosts for this loop 22690 1727204277.53108: getting the next task for host managed-node2 22690 1727204277.53115: done getting next task for host managed-node2 22690 1727204277.53119: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22690 1727204277.53122: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204277.53134: getting variables 22690 1727204277.53136: in VariableManager get_vars() 22690 1727204277.53297: Calling all_inventory to load vars for managed-node2 22690 1727204277.53300: Calling groups_inventory to load vars for managed-node2 22690 1727204277.53303: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204277.53377: Calling all_plugins_play to load vars for managed-node2 22690 1727204277.53382: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204277.53391: Calling groups_plugins_play to load vars for managed-node2 22690 1727204277.53404: WORKER PROCESS EXITING 22690 1727204277.66177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204277.68402: done with get_vars() 22690 1727204277.68445: done getting variables 22690 1727204277.68503: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.185) 0:00:44.968 ***** 22690 1727204277.68539: entering _queue_task() for managed-node2/debug 22690 1727204277.68936: worker is 1 (out of 1 available) 22690 1727204277.68951: exiting _queue_task() for managed-node2/debug 22690 1727204277.69083: done queuing things up, now waiting for results queue to drain 22690 1727204277.69085: waiting for pending results... 
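Annotation: the result dumped at main.yml:181 also echoes the module arguments that produced it, so the play-level input to the role can be read back out of the log: a single connection named lsr27 being removed (persistent_state: absent) through the NetworkManager provider, with changed: true and an empty stderr. A minimal sketch of a play that would drive the role this way; the play name, host pattern and the explicit network_provider variable (instead of letting the role auto-detect the provider) are assumptions, everything else is copied from the echoed module_args:

- name: Remove the lsr27 profile        # hypothetical play name; not visible in this log
  hosts: managed-node2
  vars:
    network_provider: nm                # assumption: the role may also auto-detect the provider
    network_connections:
      - name: lsr27
        persistent_state: absent
  roles:
    - fedora.linux_system_roles.network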
22690 1727204277.69302: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22690 1727204277.69450: in run() - task 127b8e07-fff9-78bb-bf56-000000000070 22690 1727204277.69478: variable 'ansible_search_path' from source: unknown 22690 1727204277.69486: variable 'ansible_search_path' from source: unknown 22690 1727204277.69537: calling self._execute() 22690 1727204277.69681: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.69690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.69893: variable 'omit' from source: magic vars 22690 1727204277.71175: variable 'ansible_distribution_major_version' from source: facts 22690 1727204277.71180: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204277.71536: variable 'network_state' from source: role '' defaults 22690 1727204277.71542: Evaluated conditional (network_state != {}): False 22690 1727204277.71544: when evaluation is False, skipping this task 22690 1727204277.71548: _execute() done 22690 1727204277.71551: dumping result to json 22690 1727204277.71554: done dumping result, returning 22690 1727204277.71557: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [127b8e07-fff9-78bb-bf56-000000000070] 22690 1727204277.71560: sending task result for task 127b8e07-fff9-78bb-bf56-000000000070 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 22690 1727204277.72194: no more pending results, returning what we have 22690 1727204277.72198: results queue empty 22690 1727204277.72199: checking for any_errors_fatal 22690 1727204277.72210: done checking for any_errors_fatal 22690 1727204277.72211: checking for max_fail_percentage 22690 1727204277.72212: done checking for max_fail_percentage 22690 1727204277.72213: checking to see if all hosts have failed and the running result is not ok 22690 1727204277.72214: done checking to see if all hosts have failed 22690 1727204277.72215: getting the remaining hosts for this loop 22690 1727204277.72216: done getting the remaining hosts for this loop 22690 1727204277.72221: getting the next task for host managed-node2 22690 1727204277.72227: done getting next task for host managed-node2 22690 1727204277.72231: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22690 1727204277.72234: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204277.72250: getting variables 22690 1727204277.72252: in VariableManager get_vars() 22690 1727204277.72298: Calling all_inventory to load vars for managed-node2 22690 1727204277.72301: Calling groups_inventory to load vars for managed-node2 22690 1727204277.72304: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204277.72314: Calling all_plugins_play to load vars for managed-node2 22690 1727204277.72317: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204277.72320: Calling groups_plugins_play to load vars for managed-node2 22690 1727204277.73027: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000070 22690 1727204277.73031: WORKER PROCESS EXITING 22690 1727204277.76244: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204277.79124: done with get_vars() 22690 1727204277.79156: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.107) 0:00:45.076 ***** 22690 1727204277.79268: entering _queue_task() for managed-node2/ping 22690 1727204277.79652: worker is 1 (out of 1 available) 22690 1727204277.79777: exiting _queue_task() for managed-node2/ping 22690 1727204277.79788: done queuing things up, now waiting for results queue to drain 22690 1727204277.79789: waiting for pending results... 22690 1727204277.80058: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 22690 1727204277.80328: in run() - task 127b8e07-fff9-78bb-bf56-000000000071 22690 1727204277.80358: variable 'ansible_search_path' from source: unknown 22690 1727204277.80368: variable 'ansible_search_path' from source: unknown 22690 1727204277.80416: calling self._execute() 22690 1727204277.80554: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.80574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.80590: variable 'omit' from source: magic vars 22690 1727204277.81038: variable 'ansible_distribution_major_version' from source: facts 22690 1727204277.81057: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204277.81109: variable 'omit' from source: magic vars 22690 1727204277.81133: variable 'omit' from source: magic vars 22690 1727204277.81184: variable 'omit' from source: magic vars 22690 1727204277.81240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204277.81297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204277.81400: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204277.81404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204277.81406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204277.81408: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204277.81412: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.81419: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node2' 22690 1727204277.81533: Set connection var ansible_connection to ssh 22690 1727204277.81556: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204277.81572: Set connection var ansible_pipelining to False 22690 1727204277.81580: Set connection var ansible_shell_type to sh 22690 1727204277.81591: Set connection var ansible_shell_executable to /bin/sh 22690 1727204277.81606: Set connection var ansible_timeout to 10 22690 1727204277.81642: variable 'ansible_shell_executable' from source: unknown 22690 1727204277.81656: variable 'ansible_connection' from source: unknown 22690 1727204277.81727: variable 'ansible_module_compression' from source: unknown 22690 1727204277.81730: variable 'ansible_shell_type' from source: unknown 22690 1727204277.81732: variable 'ansible_shell_executable' from source: unknown 22690 1727204277.81734: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204277.81737: variable 'ansible_pipelining' from source: unknown 22690 1727204277.81739: variable 'ansible_timeout' from source: unknown 22690 1727204277.81742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204277.81936: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204277.81962: variable 'omit' from source: magic vars 22690 1727204277.81977: starting attempt loop 22690 1727204277.81983: running the handler 22690 1727204277.82001: _low_level_execute_command(): starting 22690 1727204277.82012: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204277.83013: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204277.83072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204277.83149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204277.84918: stdout chunk (state=3): >>>/root <<< 22690 1727204277.85135: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204277.85139: stdout chunk (state=3): >>><<< 22690 1727204277.85142: stderr chunk (state=3): >>><<< 22690 1727204277.85284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204277.85289: _low_level_execute_command(): starting 22690 1727204277.85292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220 `" && echo ansible-tmp-1727204277.8517437-25684-13044535011220="` echo /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220 `" ) && sleep 0' 22690 1727204277.85906: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204277.85920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204277.85934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204277.85961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204277.86094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204277.86195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204277.86264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204277.86338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204277.86496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204277.88517: stdout chunk (state=3): >>>ansible-tmp-1727204277.8517437-25684-13044535011220=/root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220 <<< 22690 1727204277.88722: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204277.88726: stdout chunk (state=3): >>><<< 22690 1727204277.88729: stderr chunk (state=3): >>><<< 22690 1727204277.88872: _low_level_execute_command() 
done: rc=0, stdout=ansible-tmp-1727204277.8517437-25684-13044535011220=/root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204277.88877: variable 'ansible_module_compression' from source: unknown 22690 1727204277.88879: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 22690 1727204277.88917: variable 'ansible_facts' from source: unknown 22690 1727204277.89008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py 22690 1727204277.89191: Sending initial data 22690 1727204277.89204: Sent initial data (152 bytes) 22690 1727204277.89962: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204277.89972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204277.90092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204277.91793: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204277.91798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204277.91864: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpn3mcx016 /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py <<< 22690 1727204277.91875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py" <<< 22690 1727204277.91912: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 22690 1727204277.91940: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpn3mcx016" to remote "/root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py" <<< 22690 1727204277.93044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204277.93048: stdout chunk (state=3): >>><<< 22690 1727204277.93050: stderr chunk (state=3): >>><<< 22690 1727204277.93052: done transferring module to remote 22690 1727204277.93055: _low_level_execute_command(): starting 22690 1727204277.93057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/ /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py && sleep 0' 22690 1727204277.93854: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204277.93964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204277.94015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204277.94078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204277.96002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204277.96028: stdout chunk (state=3): >>><<< 22690 1727204277.96047: stderr chunk (state=3): >>><<< 22690 1727204277.96158: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204277.96168: _low_level_execute_command(): starting 22690 1727204277.96176: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/AnsiballZ_ping.py && sleep 0' 22690 1727204277.96784: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204277.96792: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204277.96806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204277.96840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204277.96940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204278.13419: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22690 1727204278.15043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204278.15048: stdout chunk (state=3): >>><<< 22690 1727204278.15050: stderr chunk (state=3): >>><<< 22690 1727204278.15053: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204278.15056: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204278.15059: _low_level_execute_command(): starting 22690 1727204278.15062: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204277.8517437-25684-13044535011220/ > /dev/null 2>&1 && sleep 0' 22690 1727204278.16268: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204278.16272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204278.16275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204278.16277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204278.16279: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204278.16487: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 
originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204278.16593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204278.16700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204278.18651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204278.18775: stderr chunk (state=3): >>><<< 22690 1727204278.18779: stdout chunk (state=3): >>><<< 22690 1727204278.18801: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204278.18807: handler run complete 22690 1727204278.18826: attempt loop complete, returning result 22690 1727204278.18831: _execute() done 22690 1727204278.18834: dumping result to json 22690 1727204278.18836: done dumping result, returning 22690 1727204278.18870: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [127b8e07-fff9-78bb-bf56-000000000071] 22690 1727204278.18874: sending task result for task 127b8e07-fff9-78bb-bf56-000000000071 22690 1727204278.19001: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000071 22690 1727204278.19004: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 22690 1727204278.19116: no more pending results, returning what we have 22690 1727204278.19120: results queue empty 22690 1727204278.19121: checking for any_errors_fatal 22690 1727204278.19125: done checking for any_errors_fatal 22690 1727204278.19126: checking for max_fail_percentage 22690 1727204278.19127: done checking for max_fail_percentage 22690 1727204278.19128: checking to see if all hosts have failed and the running result is not ok 22690 1727204278.19129: done checking to see if all hosts have failed 22690 1727204278.19129: getting the remaining hosts for this loop 22690 1727204278.19131: done getting the remaining hosts for this loop 22690 1727204278.19136: getting the next task for host managed-node2 22690 1727204278.19142: done getting next task for host managed-node2 22690 1727204278.19145: ^ task is: TASK: meta (role_complete) 22690 1727204278.19147: ^ state is: HOST STATE: block=3, task=1, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204278.19156: getting variables 22690 1727204278.19158: in VariableManager get_vars() 22690 1727204278.19196: Calling all_inventory to load vars for managed-node2 22690 1727204278.19199: Calling groups_inventory to load vars for managed-node2 22690 1727204278.19201: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204278.19211: Calling all_plugins_play to load vars for managed-node2 22690 1727204278.19216: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204278.19219: Calling groups_plugins_play to load vars for managed-node2 22690 1727204278.21480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204278.24012: done with get_vars() 22690 1727204278.24059: done getting variables 22690 1727204278.24164: done queuing things up, now waiting for results queue to drain 22690 1727204278.24169: results queue empty 22690 1727204278.24170: checking for any_errors_fatal 22690 1727204278.24173: done checking for any_errors_fatal 22690 1727204278.24174: checking for max_fail_percentage 22690 1727204278.24175: done checking for max_fail_percentage 22690 1727204278.24175: checking to see if all hosts have failed and the running result is not ok 22690 1727204278.24176: done checking to see if all hosts have failed 22690 1727204278.24177: getting the remaining hosts for this loop 22690 1727204278.24178: done getting the remaining hosts for this loop 22690 1727204278.24181: getting the next task for host managed-node2 22690 1727204278.24184: done getting next task for host managed-node2 22690 1727204278.24186: ^ task is: TASK: meta (flush_handlers) 22690 1727204278.24187: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204278.24190: getting variables 22690 1727204278.24191: in VariableManager get_vars() 22690 1727204278.24204: Calling all_inventory to load vars for managed-node2 22690 1727204278.24206: Calling groups_inventory to load vars for managed-node2 22690 1727204278.24208: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204278.24216: Calling all_plugins_play to load vars for managed-node2 22690 1727204278.24218: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204278.24220: Calling groups_plugins_play to load vars for managed-node2 22690 1727204278.26080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204278.28474: done with get_vars() 22690 1727204278.28509: done getting variables 22690 1727204278.28579: in VariableManager get_vars() 22690 1727204278.28594: Calling all_inventory to load vars for managed-node2 22690 1727204278.28596: Calling groups_inventory to load vars for managed-node2 22690 1727204278.28599: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204278.28604: Calling all_plugins_play to load vars for managed-node2 22690 1727204278.28607: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204278.28610: Calling groups_plugins_play to load vars for managed-node2 22690 1727204278.30322: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204278.32973: done with get_vars() 22690 1727204278.33021: done queuing things up, now waiting for results queue to drain 22690 1727204278.33024: results queue empty 22690 1727204278.33025: checking for any_errors_fatal 22690 1727204278.33026: done checking for any_errors_fatal 22690 1727204278.33027: checking for max_fail_percentage 22690 1727204278.33029: done checking for max_fail_percentage 22690 1727204278.33030: checking to see if all hosts have failed and the running result is not ok 22690 1727204278.33030: done checking to see if all hosts have failed 22690 1727204278.33031: getting the remaining hosts for this loop 22690 1727204278.33032: done getting the remaining hosts for this loop 22690 1727204278.33035: getting the next task for host managed-node2 22690 1727204278.33040: done getting next task for host managed-node2 22690 1727204278.33041: ^ task is: TASK: meta (flush_handlers) 22690 1727204278.33043: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204278.33046: getting variables 22690 1727204278.33047: in VariableManager get_vars() 22690 1727204278.33062: Calling all_inventory to load vars for managed-node2 22690 1727204278.33064: Calling groups_inventory to load vars for managed-node2 22690 1727204278.33068: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204278.33074: Calling all_plugins_play to load vars for managed-node2 22690 1727204278.33076: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204278.33078: Calling groups_plugins_play to load vars for managed-node2 22690 1727204278.34794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204278.37334: done with get_vars() 22690 1727204278.37377: done getting variables 22690 1727204278.37502: in VariableManager get_vars() 22690 1727204278.37523: Calling all_inventory to load vars for managed-node2 22690 1727204278.37526: Calling groups_inventory to load vars for managed-node2 22690 1727204278.37528: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204278.37535: Calling all_plugins_play to load vars for managed-node2 22690 1727204278.37537: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204278.37541: Calling groups_plugins_play to load vars for managed-node2 22690 1727204278.39394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204278.41641: done with get_vars() 22690 1727204278.41691: done queuing things up, now waiting for results queue to drain 22690 1727204278.41694: results queue empty 22690 1727204278.41695: checking for any_errors_fatal 22690 1727204278.41697: done checking for any_errors_fatal 22690 1727204278.41698: checking for max_fail_percentage 22690 1727204278.41699: done checking for max_fail_percentage 22690 1727204278.41700: checking to see if all hosts have failed and the running result is not ok 22690 1727204278.41701: done checking to see if all hosts have failed 22690 1727204278.41702: getting the remaining hosts for this loop 22690 1727204278.41703: done getting the remaining hosts for this loop 22690 1727204278.41711: getting the next task for host managed-node2 22690 1727204278.41719: done getting next task for host managed-node2 22690 1727204278.41720: ^ task is: None 22690 1727204278.41721: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204278.41723: done queuing things up, now waiting for results queue to drain 22690 1727204278.41724: results queue empty 22690 1727204278.41725: checking for any_errors_fatal 22690 1727204278.41726: done checking for any_errors_fatal 22690 1727204278.41726: checking for max_fail_percentage 22690 1727204278.41732: done checking for max_fail_percentage 22690 1727204278.41733: checking to see if all hosts have failed and the running result is not ok 22690 1727204278.41734: done checking to see if all hosts have failed 22690 1727204278.41736: getting the next task for host managed-node2 22690 1727204278.41739: done getting next task for host managed-node2 22690 1727204278.41740: ^ task is: None 22690 1727204278.41741: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204278.41787: in VariableManager get_vars() 22690 1727204278.41806: done with get_vars() 22690 1727204278.41811: in VariableManager get_vars() 22690 1727204278.41824: done with get_vars() 22690 1727204278.41829: variable 'omit' from source: magic vars 22690 1727204278.41876: in VariableManager get_vars() 22690 1727204278.41889: done with get_vars() 22690 1727204278.41917: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 22690 1727204278.42234: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204278.42262: getting the remaining hosts for this loop 22690 1727204278.42263: done getting the remaining hosts for this loop 22690 1727204278.42271: getting the next task for host managed-node2 22690 1727204278.42280: done getting next task for host managed-node2 22690 1727204278.42282: ^ task is: TASK: Gathering Facts 22690 1727204278.42284: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204278.42286: getting variables 22690 1727204278.42287: in VariableManager get_vars() 22690 1727204278.42297: Calling all_inventory to load vars for managed-node2 22690 1727204278.42299: Calling groups_inventory to load vars for managed-node2 22690 1727204278.42302: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204278.42309: Calling all_plugins_play to load vars for managed-node2 22690 1727204278.42312: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204278.42318: Calling groups_plugins_play to load vars for managed-node2 22690 1727204278.43988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204278.46338: done with get_vars() 22690 1727204278.46368: done getting variables 22690 1727204278.46421: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.671) 0:00:45.747 ***** 22690 1727204278.46448: entering _queue_task() for managed-node2/gather_facts 22690 1727204278.46840: worker is 1 (out of 1 available) 22690 1727204278.46852: exiting _queue_task() for managed-node2/gather_facts 22690 1727204278.46869: done queuing things up, now waiting for results queue to drain 22690 1727204278.46870: waiting for pending results... 
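Annotation: between the ping re-test that closed the role (the mkdir / sftp of AnsiballZ_ping.py / chmod / execute / rm exchange ending in {"ping": "pong"} above) and the fact gathering just queued, the role block completes, handlers are flushed twice, and a new play, "Assert device and profile are absent" from tests_ethernet.yml, takes over. The re-test itself is a plain ping; a minimal sketch of the task at roles/network/tasks/main.yml:192, with nothing assumed beyond the name since ping needs no arguments:

- name: Re-test connectivity
  ping:

The number of SSH exchanges per module call follows from the connection vars set in the trace: ansible_pipelining is False, so each module is copied into a temporary directory and executed there rather than being piped to the remote interpreter, and the multiplexed master connection at /root/.ansible/cp/7ef5e35320 is what keeps the per-step cost down.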
22690 1727204278.47185: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204278.47282: in run() - task 127b8e07-fff9-78bb-bf56-0000000004e4 22690 1727204278.47377: variable 'ansible_search_path' from source: unknown 22690 1727204278.47381: calling self._execute() 22690 1727204278.47450: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204278.47463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204278.47481: variable 'omit' from source: magic vars 22690 1727204278.48022: variable 'ansible_distribution_major_version' from source: facts 22690 1727204278.48047: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204278.48061: variable 'omit' from source: magic vars 22690 1727204278.48105: variable 'omit' from source: magic vars 22690 1727204278.48173: variable 'omit' from source: magic vars 22690 1727204278.48227: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204278.48286: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204278.48351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204278.48357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204278.48376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204278.48419: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204278.48460: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204278.48463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204278.48561: Set connection var ansible_connection to ssh 22690 1727204278.48589: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204278.48603: Set connection var ansible_pipelining to False 22690 1727204278.48673: Set connection var ansible_shell_type to sh 22690 1727204278.48677: Set connection var ansible_shell_executable to /bin/sh 22690 1727204278.48680: Set connection var ansible_timeout to 10 22690 1727204278.48684: variable 'ansible_shell_executable' from source: unknown 22690 1727204278.48687: variable 'ansible_connection' from source: unknown 22690 1727204278.48689: variable 'ansible_module_compression' from source: unknown 22690 1727204278.48692: variable 'ansible_shell_type' from source: unknown 22690 1727204278.48694: variable 'ansible_shell_executable' from source: unknown 22690 1727204278.48696: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204278.48771: variable 'ansible_pipelining' from source: unknown 22690 1727204278.48775: variable 'ansible_timeout' from source: unknown 22690 1727204278.48780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204278.48957: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204278.49005: variable 'omit' from source: magic vars 22690 1727204278.49009: starting attempt loop 22690 1727204278.49012: running the 
handler 22690 1727204278.49031: variable 'ansible_facts' from source: unknown 22690 1727204278.49059: _low_level_execute_command(): starting 22690 1727204278.49117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204278.50012: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204278.50073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204278.50102: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204278.50150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204278.50228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204278.52023: stdout chunk (state=3): >>>/root <<< 22690 1727204278.52228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204278.52232: stdout chunk (state=3): >>><<< 22690 1727204278.52234: stderr chunk (state=3): >>><<< 22690 1727204278.52257: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204278.52279: _low_level_execute_command(): starting 22690 1727204278.52362: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840 `" && echo ansible-tmp-1727204278.5226278-25715-18377432889840="` echo 
/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840 `" ) && sleep 0' 22690 1727204278.53048: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204278.53085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204278.53200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204278.55192: stdout chunk (state=3): >>>ansible-tmp-1727204278.5226278-25715-18377432889840=/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840 <<< 22690 1727204278.55421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204278.55425: stdout chunk (state=3): >>><<< 22690 1727204278.55428: stderr chunk (state=3): >>><<< 22690 1727204278.55589: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204278.5226278-25715-18377432889840=/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204278.55593: variable 'ansible_module_compression' from source: unknown 22690 1727204278.55597: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204278.55668: variable 'ansible_facts' from source: unknown 22690 1727204278.55927: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py 22690 1727204278.56459: Sending initial data 22690 1727204278.56462: Sent initial data (153 bytes) 22690 1727204278.57122: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204278.57125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204278.57129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204278.57136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204278.57142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204278.57184: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204278.57213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204278.57316: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204278.58992: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204278.59075: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204278.59162: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpd3dibk6t /root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py <<< 22690 1727204278.59168: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py" <<< 22690 1727204278.59238: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpd3dibk6t" to remote "/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py" <<< 22690 1727204278.61079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204278.61084: stdout chunk (state=3): >>><<< 22690 1727204278.61086: stderr chunk (state=3): >>><<< 22690 1727204278.61089: done transferring module to remote 22690 1727204278.61091: _low_level_execute_command(): starting 22690 1727204278.61094: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/ /root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py && sleep 0' 22690 1727204278.62159: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204278.62164: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204278.62399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204278.62718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204278.62722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204278.64526: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204278.64605: stderr chunk (state=3): >>><<< 22690 1727204278.64620: stdout chunk (state=3): >>><<< 22690 1727204278.64773: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204278.64782: _low_level_execute_command(): starting 22690 1727204278.64785: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/AnsiballZ_setup.py && sleep 0' 22690 1727204278.66270: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204278.66473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204278.66489: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204278.66511: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204278.66698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204279.32045: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.5625, "5m": 0.52099609375, "15m": 0.2890625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "3<<< 22690 1727204279.32077: stdout chunk (state=3): >>>9", "day": "24", "hour": "14", "minute": "57", "second": "58", "epoch": "1727204278", "epoch_int": "1727204278", "date": "2024-09-24", "time": "14:57:58", "iso8601_micro": "2024-09-24T18:57:58.967649Z", "iso8601": "2024-09-24T18:57:58Z", "iso8601_basic": "20240924T145758967649", "iso8601_basic_short": "20240924T145758", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) 
CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3047, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 669, "free": 3047}, "nocache": {"free": 3479, "used": 237}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 625, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316502528, "block_size": 4096, "block_total": 64479564, "block_available": 61356568, "block_used": 3122996, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": 
{"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204279.34292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204279.34398: stderr chunk (state=3): >>><<< 22690 1727204279.34402: stdout chunk (state=3): >>><<< 22690 1727204279.34675: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.5625, "5m": 0.52099609375, "15m": 0.2890625}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "58", "epoch": "1727204278", "epoch_int": "1727204278", "date": "2024-09-24", "time": "14:57:58", "iso8601_micro": "2024-09-24T18:57:58.967649Z", "iso8601": "2024-09-24T18:57:58Z", "iso8601_basic": "20240924T145758967649", "iso8601_basic_short": "20240924T145758", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3047, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 669, "free": 3047}, "nocache": {"free": 3479, "used": 237}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", 
"ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 625, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316502528, "block_size": 4096, "block_total": 64479564, "block_available": 61356568, "block_used": 3122996, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": 
[{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204279.35262: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204279.35304: _low_level_execute_command(): starting 22690 1727204279.35316: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204278.5226278-25715-18377432889840/ > /dev/null 2>&1 && sleep 0' 22690 1727204279.36839: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204279.36946: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204279.36976: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204279.37075: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204279.39192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204279.39408: stderr chunk (state=3): >>><<< 22690 1727204279.39412: stdout chunk (state=3): >>><<< 22690 1727204279.39431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204279.39447: handler run complete 22690 1727204279.39702: variable 'ansible_facts' from source: unknown 22690 1727204279.39991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.40636: variable 'ansible_facts' from source: unknown 22690 1727204279.41172: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.41177: attempt loop complete, returning result 22690 1727204279.41180: _execute() done 22690 1727204279.41182: dumping result to json 22690 1727204279.41185: done dumping result, returning 22690 1727204279.41187: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-0000000004e4] 22690 1727204279.41189: sending task result for task 127b8e07-fff9-78bb-bf56-0000000004e4 ok: [managed-node2] 22690 1727204279.42569: no more pending results, returning what we have 22690 1727204279.42573: results queue empty 22690 1727204279.42575: checking for any_errors_fatal 22690 1727204279.42576: done checking for any_errors_fatal 22690 1727204279.42577: checking for max_fail_percentage 22690 1727204279.42579: done checking for max_fail_percentage 22690 1727204279.42580: checking to see if all hosts have failed and the running result is not ok 22690 1727204279.42581: done checking to see if all hosts have failed 22690 1727204279.42582: getting the remaining hosts for this loop 22690 1727204279.42583: done getting the remaining hosts for this loop 22690 1727204279.42588: getting the next task for host managed-node2 22690 1727204279.42593: done getting next task for host managed-node2 22690 1727204279.42595: ^ task is: TASK: meta (flush_handlers) 22690 1727204279.42597: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204279.42601: getting variables 22690 1727204279.42603: in VariableManager get_vars() 22690 1727204279.42631: Calling all_inventory to load vars for managed-node2 22690 1727204279.42634: Calling groups_inventory to load vars for managed-node2 22690 1727204279.42637: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.43080: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.43085: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.43089: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.43786: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000004e4 22690 1727204279.43791: WORKER PROCESS EXITING 22690 1727204279.45892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.48192: done with get_vars() 22690 1727204279.48232: done getting variables 22690 1727204279.48321: in VariableManager get_vars() 22690 1727204279.48333: Calling all_inventory to load vars for managed-node2 22690 1727204279.48336: Calling groups_inventory to load vars for managed-node2 22690 1727204279.48339: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.48344: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.48346: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.48349: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.50046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.53022: done with get_vars() 22690 1727204279.53068: done queuing things up, now waiting for results queue to drain 22690 1727204279.53070: results queue empty 22690 1727204279.53071: checking for any_errors_fatal 22690 1727204279.53076: done checking for any_errors_fatal 22690 1727204279.53077: checking for max_fail_percentage 22690 1727204279.53078: done checking for max_fail_percentage 22690 1727204279.53079: checking to see if all hosts have failed and the running result is not ok 22690 1727204279.53079: done checking to see if all hosts have failed 22690 1727204279.53086: getting the remaining hosts for this loop 22690 1727204279.53087: done getting the remaining hosts for this loop 22690 1727204279.53090: getting the next task for host managed-node2 22690 1727204279.53095: done getting next task for host managed-node2 22690 1727204279.53098: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 22690 1727204279.53099: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204279.53102: getting variables 22690 1727204279.53103: in VariableManager get_vars() 22690 1727204279.53113: Calling all_inventory to load vars for managed-node2 22690 1727204279.53116: Calling groups_inventory to load vars for managed-node2 22690 1727204279.53118: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.53125: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.53127: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.53130: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.54671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.56806: done with get_vars() 22690 1727204279.56842: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Tuesday 24 September 2024 14:57:59 -0400 (0:00:01.104) 0:00:46.852 ***** 22690 1727204279.56931: entering _queue_task() for managed-node2/include_tasks 22690 1727204279.57408: worker is 1 (out of 1 available) 22690 1727204279.57423: exiting _queue_task() for managed-node2/include_tasks 22690 1727204279.57437: done queuing things up, now waiting for results queue to drain 22690 1727204279.57438: waiting for pending results... 22690 1727204279.57746: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' 22690 1727204279.58274: in run() - task 127b8e07-fff9-78bb-bf56-000000000074 22690 1727204279.58279: variable 'ansible_search_path' from source: unknown 22690 1727204279.58282: calling self._execute() 22690 1727204279.58285: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204279.58288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204279.58291: variable 'omit' from source: magic vars 22690 1727204279.59222: variable 'ansible_distribution_major_version' from source: facts 22690 1727204279.59247: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204279.59285: _execute() done 22690 1727204279.59295: dumping result to json 22690 1727204279.59381: done dumping result, returning 22690 1727204279.59397: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' [127b8e07-fff9-78bb-bf56-000000000074] 22690 1727204279.59409: sending task result for task 127b8e07-fff9-78bb-bf56-000000000074 22690 1727204279.59772: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000074 22690 1727204279.59776: WORKER PROCESS EXITING 22690 1727204279.59812: no more pending results, returning what we have 22690 1727204279.59818: in VariableManager get_vars() 22690 1727204279.59864: Calling all_inventory to load vars for managed-node2 22690 1727204279.59869: Calling groups_inventory to load vars for managed-node2 22690 1727204279.59874: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.59892: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.59897: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.59901: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.63547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.68385: done with get_vars() 22690 
1727204279.68430: variable 'ansible_search_path' from source: unknown 22690 1727204279.68449: we have included files to process 22690 1727204279.68450: generating all_blocks data 22690 1727204279.68452: done generating all_blocks data 22690 1727204279.68453: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 22690 1727204279.68454: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 22690 1727204279.68457: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 22690 1727204279.68860: in VariableManager get_vars() 22690 1727204279.69006: done with get_vars() 22690 1727204279.69147: done processing included file 22690 1727204279.69150: iterating over new_blocks loaded from include file 22690 1727204279.69151: in VariableManager get_vars() 22690 1727204279.69171: done with get_vars() 22690 1727204279.69173: filtering new block on tags 22690 1727204279.69193: done filtering new block on tags 22690 1727204279.69196: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 22690 1727204279.69203: extending task lists for all hosts with included blocks 22690 1727204279.69261: done extending task lists 22690 1727204279.69262: done processing included files 22690 1727204279.69263: results queue empty 22690 1727204279.69264: checking for any_errors_fatal 22690 1727204279.69268: done checking for any_errors_fatal 22690 1727204279.69269: checking for max_fail_percentage 22690 1727204279.69270: done checking for max_fail_percentage 22690 1727204279.69271: checking to see if all hosts have failed and the running result is not ok 22690 1727204279.69272: done checking to see if all hosts have failed 22690 1727204279.69273: getting the remaining hosts for this loop 22690 1727204279.69274: done getting the remaining hosts for this loop 22690 1727204279.69281: getting the next task for host managed-node2 22690 1727204279.69286: done getting next task for host managed-node2 22690 1727204279.69288: ^ task is: TASK: Include the task 'get_profile_stat.yml' 22690 1727204279.69291: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204279.69294: getting variables 22690 1727204279.69295: in VariableManager get_vars() 22690 1727204279.69306: Calling all_inventory to load vars for managed-node2 22690 1727204279.69308: Calling groups_inventory to load vars for managed-node2 22690 1727204279.69311: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.69320: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.69323: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.69326: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.71228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.74960: done with get_vars() 22690 1727204279.75209: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.183) 0:00:47.036 ***** 22690 1727204279.75309: entering _queue_task() for managed-node2/include_tasks 22690 1727204279.76136: worker is 1 (out of 1 available) 22690 1727204279.76153: exiting _queue_task() for managed-node2/include_tasks 22690 1727204279.76577: done queuing things up, now waiting for results queue to drain 22690 1727204279.76579: waiting for pending results... 22690 1727204279.76637: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 22690 1727204279.77076: in run() - task 127b8e07-fff9-78bb-bf56-0000000004f5 22690 1727204279.77081: variable 'ansible_search_path' from source: unknown 22690 1727204279.77085: variable 'ansible_search_path' from source: unknown 22690 1727204279.77088: calling self._execute() 22690 1727204279.77318: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204279.77385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204279.77404: variable 'omit' from source: magic vars 22690 1727204279.78259: variable 'ansible_distribution_major_version' from source: facts 22690 1727204279.78575: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204279.78580: _execute() done 22690 1727204279.78583: dumping result to json 22690 1727204279.78586: done dumping result, returning 22690 1727204279.78589: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [127b8e07-fff9-78bb-bf56-0000000004f5] 22690 1727204279.78591: sending task result for task 127b8e07-fff9-78bb-bf56-0000000004f5 22690 1727204279.78884: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000004f5 22690 1727204279.78889: WORKER PROCESS EXITING 22690 1727204279.78925: no more pending results, returning what we have 22690 1727204279.78932: in VariableManager get_vars() 22690 1727204279.78974: Calling all_inventory to load vars for managed-node2 22690 1727204279.78979: Calling groups_inventory to load vars for managed-node2 22690 1727204279.78983: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.79000: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.79004: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.79007: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.82783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 22690 1727204279.85091: done with get_vars() 22690 1727204279.85129: variable 'ansible_search_path' from source: unknown 22690 1727204279.85130: variable 'ansible_search_path' from source: unknown 22690 1727204279.85178: we have included files to process 22690 1727204279.85180: generating all_blocks data 22690 1727204279.85181: done generating all_blocks data 22690 1727204279.85183: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22690 1727204279.85184: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22690 1727204279.85187: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22690 1727204279.86337: done processing included file 22690 1727204279.86340: iterating over new_blocks loaded from include file 22690 1727204279.86341: in VariableManager get_vars() 22690 1727204279.86357: done with get_vars() 22690 1727204279.86359: filtering new block on tags 22690 1727204279.86388: done filtering new block on tags 22690 1727204279.86392: in VariableManager get_vars() 22690 1727204279.86408: done with get_vars() 22690 1727204279.86409: filtering new block on tags 22690 1727204279.86432: done filtering new block on tags 22690 1727204279.86435: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 22690 1727204279.86441: extending task lists for all hosts with included blocks 22690 1727204279.86552: done extending task lists 22690 1727204279.86554: done processing included files 22690 1727204279.86555: results queue empty 22690 1727204279.86555: checking for any_errors_fatal 22690 1727204279.86559: done checking for any_errors_fatal 22690 1727204279.86560: checking for max_fail_percentage 22690 1727204279.86561: done checking for max_fail_percentage 22690 1727204279.86562: checking to see if all hosts have failed and the running result is not ok 22690 1727204279.86563: done checking to see if all hosts have failed 22690 1727204279.86564: getting the remaining hosts for this loop 22690 1727204279.86568: done getting the remaining hosts for this loop 22690 1727204279.86571: getting the next task for host managed-node2 22690 1727204279.86576: done getting next task for host managed-node2 22690 1727204279.86578: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 22690 1727204279.86581: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204279.86583: getting variables 22690 1727204279.86584: in VariableManager get_vars() 22690 1727204279.86683: Calling all_inventory to load vars for managed-node2 22690 1727204279.86687: Calling groups_inventory to load vars for managed-node2 22690 1727204279.86690: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.86696: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.86698: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.86702: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.88182: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.90398: done with get_vars() 22690 1727204279.90427: done getting variables 22690 1727204279.90480: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.152) 0:00:47.188 ***** 22690 1727204279.90513: entering _queue_task() for managed-node2/set_fact 22690 1727204279.90899: worker is 1 (out of 1 available) 22690 1727204279.90913: exiting _queue_task() for managed-node2/set_fact 22690 1727204279.90928: done queuing things up, now waiting for results queue to drain 22690 1727204279.90930: waiting for pending results... 
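The included get_profile_stat.yml drives the next four tasks in this excerpt: a set_fact at line 3, a stat at line 9, a second set_fact at line 17, and a shell task at line 25 of that file. The first of them, queued above, initializes three bookkeeping flags; its result further below reports all of them as false. A minimal Python sketch of those initial values, assuming nothing beyond what the reported ansible_facts show:

# Initial flag values, copied from the "Initialize NM profile exist and
# ansible_managed comment flag" result reported further below in this log.
initial_flags = {
    "lsr_net_profile_exists": False,
    "lsr_net_profile_ansible_managed": False,
    "lsr_net_profile_fingerprint": False,
}
print(initial_flags)

The stat check and the "Get NM profile info" task that follow are what could change these flags; in this run the profile file is absent, so the stat-based flag is left untouched below.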
22690 1727204279.91446: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 22690 1727204279.91807: in run() - task 127b8e07-fff9-78bb-bf56-000000000502 22690 1727204279.91916: variable 'ansible_search_path' from source: unknown 22690 1727204279.91922: variable 'ansible_search_path' from source: unknown 22690 1727204279.91926: calling self._execute() 22690 1727204279.92113: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204279.92144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204279.92185: variable 'omit' from source: magic vars 22690 1727204279.93087: variable 'ansible_distribution_major_version' from source: facts 22690 1727204279.93250: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204279.93267: variable 'omit' from source: magic vars 22690 1727204279.93335: variable 'omit' from source: magic vars 22690 1727204279.93562: variable 'omit' from source: magic vars 22690 1727204279.93568: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204279.93702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204279.93734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204279.93805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204279.93827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204279.94001: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204279.94005: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204279.94008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204279.94273: Set connection var ansible_connection to ssh 22690 1727204279.94277: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204279.94279: Set connection var ansible_pipelining to False 22690 1727204279.94281: Set connection var ansible_shell_type to sh 22690 1727204279.94282: Set connection var ansible_shell_executable to /bin/sh 22690 1727204279.94284: Set connection var ansible_timeout to 10 22690 1727204279.94286: variable 'ansible_shell_executable' from source: unknown 22690 1727204279.94288: variable 'ansible_connection' from source: unknown 22690 1727204279.94290: variable 'ansible_module_compression' from source: unknown 22690 1727204279.94292: variable 'ansible_shell_type' from source: unknown 22690 1727204279.94294: variable 'ansible_shell_executable' from source: unknown 22690 1727204279.94385: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204279.94399: variable 'ansible_pipelining' from source: unknown 22690 1727204279.94408: variable 'ansible_timeout' from source: unknown 22690 1727204279.94417: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204279.94873: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204279.94877: variable 
'omit' from source: magic vars 22690 1727204279.94880: starting attempt loop 22690 1727204279.94882: running the handler 22690 1727204279.94894: handler run complete 22690 1727204279.94908: attempt loop complete, returning result 22690 1727204279.94914: _execute() done 22690 1727204279.94925: dumping result to json 22690 1727204279.94933: done dumping result, returning 22690 1727204279.94945: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [127b8e07-fff9-78bb-bf56-000000000502] 22690 1727204279.94954: sending task result for task 127b8e07-fff9-78bb-bf56-000000000502 ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 22690 1727204279.95334: no more pending results, returning what we have 22690 1727204279.95338: results queue empty 22690 1727204279.95340: checking for any_errors_fatal 22690 1727204279.95342: done checking for any_errors_fatal 22690 1727204279.95343: checking for max_fail_percentage 22690 1727204279.95345: done checking for max_fail_percentage 22690 1727204279.95346: checking to see if all hosts have failed and the running result is not ok 22690 1727204279.95347: done checking to see if all hosts have failed 22690 1727204279.95347: getting the remaining hosts for this loop 22690 1727204279.95349: done getting the remaining hosts for this loop 22690 1727204279.95355: getting the next task for host managed-node2 22690 1727204279.95363: done getting next task for host managed-node2 22690 1727204279.95369: ^ task is: TASK: Stat profile file 22690 1727204279.95374: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204279.95379: getting variables 22690 1727204279.95382: in VariableManager get_vars() 22690 1727204279.95418: Calling all_inventory to load vars for managed-node2 22690 1727204279.95421: Calling groups_inventory to load vars for managed-node2 22690 1727204279.95425: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204279.95441: Calling all_plugins_play to load vars for managed-node2 22690 1727204279.95444: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204279.95447: Calling groups_plugins_play to load vars for managed-node2 22690 1727204279.96132: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000502 22690 1727204279.96136: WORKER PROCESS EXITING 22690 1727204279.97432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204279.99627: done with get_vars() 22690 1727204279.99670: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.092) 0:00:47.281 ***** 22690 1727204279.99785: entering _queue_task() for managed-node2/stat 22690 1727204280.00221: worker is 1 (out of 1 available) 22690 1727204280.00233: exiting _queue_task() for managed-node2/stat 22690 1727204280.00249: done queuing things up, now waiting for results queue to drain 22690 1727204280.00251: waiting for pending results... 22690 1727204280.00516: running TaskExecutor() for managed-node2/TASK: Stat profile file 22690 1727204280.00661: in run() - task 127b8e07-fff9-78bb-bf56-000000000503 22690 1727204280.00692: variable 'ansible_search_path' from source: unknown 22690 1727204280.00701: variable 'ansible_search_path' from source: unknown 22690 1727204280.00753: calling self._execute() 22690 1727204280.00866: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.00882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.00898: variable 'omit' from source: magic vars 22690 1727204280.01350: variable 'ansible_distribution_major_version' from source: facts 22690 1727204280.01375: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204280.01385: variable 'omit' from source: magic vars 22690 1727204280.01442: variable 'omit' from source: magic vars 22690 1727204280.01557: variable 'profile' from source: include params 22690 1727204280.01569: variable 'interface' from source: set_fact 22690 1727204280.01659: variable 'interface' from source: set_fact 22690 1727204280.01690: variable 'omit' from source: magic vars 22690 1727204280.01747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204280.01794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204280.01823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204280.01854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204280.01877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204280.01915: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 22690 1727204280.01924: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.01932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.02037: Set connection var ansible_connection to ssh 22690 1727204280.02057: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204280.02072: Set connection var ansible_pipelining to False 22690 1727204280.02079: Set connection var ansible_shell_type to sh 22690 1727204280.02088: Set connection var ansible_shell_executable to /bin/sh 22690 1727204280.02100: Set connection var ansible_timeout to 10 22690 1727204280.02130: variable 'ansible_shell_executable' from source: unknown 22690 1727204280.02138: variable 'ansible_connection' from source: unknown 22690 1727204280.02158: variable 'ansible_module_compression' from source: unknown 22690 1727204280.02161: variable 'ansible_shell_type' from source: unknown 22690 1727204280.02164: variable 'ansible_shell_executable' from source: unknown 22690 1727204280.02167: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.02278: variable 'ansible_pipelining' from source: unknown 22690 1727204280.02282: variable 'ansible_timeout' from source: unknown 22690 1727204280.02285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.02427: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204280.02445: variable 'omit' from source: magic vars 22690 1727204280.02457: starting attempt loop 22690 1727204280.02464: running the handler 22690 1727204280.02484: _low_level_execute_command(): starting 22690 1727204280.02496: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204280.03261: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204280.03398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.03426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.03544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.05321: stdout chunk (state=3): >>>/root <<< 22690 1727204280.05597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.05899: stderr chunk (state=3): >>><<< 22690 
1727204280.05903: stdout chunk (state=3): >>><<< 22690 1727204280.05907: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.05910: _low_level_execute_command(): starting 22690 1727204280.05913: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277 `" && echo ansible-tmp-1727204280.05836-25779-13607303099277="` echo /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277 `" ) && sleep 0' 22690 1727204280.07291: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204280.07486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.07687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.07954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.09974: stdout chunk (state=3): >>>ansible-tmp-1727204280.05836-25779-13607303099277=/root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277 <<< 22690 1727204280.10085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.10250: stderr chunk (state=3): >>><<< 22690 1727204280.10253: stdout chunk (state=3): >>><<< 22690 1727204280.10256: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204280.05836-25779-13607303099277=/root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.10263: variable 'ansible_module_compression' from source: unknown 22690 1727204280.10331: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22690 1727204280.10480: variable 'ansible_facts' from source: unknown 22690 1727204280.10548: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py 22690 1727204280.11027: Sending initial data 22690 1727204280.11031: Sent initial data (150 bytes) 22690 1727204280.12434: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.12440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204280.12508: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.12527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.12631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.14326: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22690 1727204280.14341: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204280.14441: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204280.14531: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpes32gko6 /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py <<< 22690 1727204280.14535: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py" <<< 22690 1727204280.14610: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpes32gko6" to remote "/root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py" <<< 22690 1727204280.16617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.16785: stderr chunk (state=3): >>><<< 22690 1727204280.16792: stdout chunk (state=3): >>><<< 22690 1727204280.16794: done transferring module to remote 22690 1727204280.16897: _low_level_execute_command(): starting 22690 1727204280.16904: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/ /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py && sleep 0' 22690 1727204280.17606: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204280.17610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.17678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204280.17696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.17772: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204280.17779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.17782: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.17896: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 22690 1727204280.20005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.20032: stderr chunk (state=3): >>><<< 22690 1727204280.20035: stdout chunk (state=3): >>><<< 22690 1727204280.20310: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.20320: _low_level_execute_command(): starting 22690 1727204280.20323: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/AnsiballZ_stat.py && sleep 0' 22690 1727204280.21115: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204280.21190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.21207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204280.21219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.21342: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.37923: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22690 1727204280.39399: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204280.39510: stderr chunk (state=3): >>><<< 22690 1727204280.39520: stdout chunk (state=3): >>><<< 22690 1727204280.39542: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204280.39600: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204280.39604: _low_level_execute_command(): starting 22690 1727204280.39673: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204280.05836-25779-13607303099277/ > /dev/null 2>&1 && sleep 0' 22690 1727204280.40491: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.40523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.40632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.42773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.42778: stdout chunk (state=3): >>><<< 22690 1727204280.42781: stderr chunk (state=3): >>><<< 22690 1727204280.42784: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.42787: handler run complete 22690 1727204280.42789: attempt loop complete, returning result 22690 1727204280.42791: _execute() done 22690 1727204280.42798: dumping result to json 22690 1727204280.42801: done dumping result, returning 22690 1727204280.42803: done running TaskExecutor() for managed-node2/TASK: Stat profile file [127b8e07-fff9-78bb-bf56-000000000503] 22690 1727204280.42805: sending task result for task 127b8e07-fff9-78bb-bf56-000000000503 ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 22690 1727204280.43143: no more pending results, returning what we have 22690 1727204280.43148: results queue empty 22690 1727204280.43149: checking for any_errors_fatal 22690 1727204280.43159: done checking for any_errors_fatal 22690 1727204280.43160: checking for max_fail_percentage 22690 1727204280.43162: done checking for max_fail_percentage 22690 1727204280.43162: checking to see if all hosts have failed and the running result is not ok 22690 1727204280.43164: done checking to see if all hosts have failed 22690 1727204280.43164: getting the remaining hosts for this loop 22690 1727204280.43168: done getting the remaining hosts for this loop 22690 1727204280.43173: getting the next task for host managed-node2 22690 1727204280.43184: done getting next task for host managed-node2 22690 1727204280.43187: ^ task is: TASK: Set NM profile exist flag based on the profile files 22690 1727204280.43191: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204280.43198: getting variables 22690 1727204280.43200: in VariableManager get_vars() 22690 1727204280.43236: Calling all_inventory to load vars for managed-node2 22690 1727204280.43239: Calling groups_inventory to load vars for managed-node2 22690 1727204280.43242: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204280.43257: Calling all_plugins_play to load vars for managed-node2 22690 1727204280.43260: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204280.43263: Calling groups_plugins_play to load vars for managed-node2 22690 1727204280.43392: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000503 22690 1727204280.43395: WORKER PROCESS EXITING 22690 1727204280.53526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204280.58068: done with get_vars() 22690 1727204280.58113: done getting variables 22690 1727204280.58389: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.586) 0:00:47.867 ***** 22690 1727204280.58427: entering _queue_task() for managed-node2/set_fact 22690 1727204280.59243: worker is 1 (out of 1 available) 22690 1727204280.59259: exiting _queue_task() for managed-node2/set_fact 22690 1727204280.59278: done queuing things up, now waiting for results queue to drain 22690 1727204280.59280: waiting for pending results... 
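The stat call above came back with "exists": false for /etc/sysconfig/network-scripts/ifcfg-lsr27, so the task just queued here, "Set NM profile exist flag based on the profile files" (get_profile_stat.yml:17), is skipped below: the skip result records its condition as profile_stat.stat.exists. A minimal sketch of that gate, using the variable names and values taken from this log (the real task is a set_fact in the test playbook, not Python, so treat this only as an illustration):

# profile_stat is the result of the "Stat profile file" task above; the flags are the
# facts initialized at get_profile_stat.yml:3. The if mirrors the
# "false_condition": "profile_stat.stat.exists" reported in the skip result below.
profile_stat = {"changed": False, "stat": {"exists": False}}
flags = {
    "lsr_net_profile_exists": False,
    "lsr_net_profile_ansible_managed": False,
    "lsr_net_profile_fingerprint": False,
}
if profile_stat["stat"]["exists"]:
    flags["lsr_net_profile_exists"] = True  # assumption: what the skipped set_fact would do
print(flags)

Because the condition is false, the flag keeps its initialized value, which is consistent with the parent assert_profile_absent.yml expecting the profile to be gone.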
22690 1727204280.59790: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 22690 1727204280.60140: in run() - task 127b8e07-fff9-78bb-bf56-000000000504 22690 1727204280.60193: variable 'ansible_search_path' from source: unknown 22690 1727204280.60473: variable 'ansible_search_path' from source: unknown 22690 1727204280.60478: calling self._execute() 22690 1727204280.60612: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.60621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.60635: variable 'omit' from source: magic vars 22690 1727204280.61231: variable 'ansible_distribution_major_version' from source: facts 22690 1727204280.61236: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204280.61350: variable 'profile_stat' from source: set_fact 22690 1727204280.61401: Evaluated conditional (profile_stat.stat.exists): False 22690 1727204280.61406: when evaluation is False, skipping this task 22690 1727204280.61409: _execute() done 22690 1727204280.61412: dumping result to json 22690 1727204280.61415: done dumping result, returning 22690 1727204280.61418: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [127b8e07-fff9-78bb-bf56-000000000504] 22690 1727204280.61421: sending task result for task 127b8e07-fff9-78bb-bf56-000000000504 22690 1727204280.61641: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000504 22690 1727204280.61645: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22690 1727204280.61721: no more pending results, returning what we have 22690 1727204280.61726: results queue empty 22690 1727204280.61728: checking for any_errors_fatal 22690 1727204280.61737: done checking for any_errors_fatal 22690 1727204280.61738: checking for max_fail_percentage 22690 1727204280.61740: done checking for max_fail_percentage 22690 1727204280.61741: checking to see if all hosts have failed and the running result is not ok 22690 1727204280.61742: done checking to see if all hosts have failed 22690 1727204280.61743: getting the remaining hosts for this loop 22690 1727204280.61745: done getting the remaining hosts for this loop 22690 1727204280.61749: getting the next task for host managed-node2 22690 1727204280.61756: done getting next task for host managed-node2 22690 1727204280.61760: ^ task is: TASK: Get NM profile info 22690 1727204280.61767: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204280.61770: getting variables 22690 1727204280.61773: in VariableManager get_vars() 22690 1727204280.61811: Calling all_inventory to load vars for managed-node2 22690 1727204280.61817: Calling groups_inventory to load vars for managed-node2 22690 1727204280.61821: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204280.61836: Calling all_plugins_play to load vars for managed-node2 22690 1727204280.61840: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204280.61843: Calling groups_plugins_play to load vars for managed-node2 22690 1727204280.64751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204280.67501: done with get_vars() 22690 1727204280.67541: done getting variables 22690 1727204280.67649: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.092) 0:00:47.960 ***** 22690 1727204280.67686: entering _queue_task() for managed-node2/shell 22690 1727204280.67688: Creating lock for shell 22690 1727204280.68092: worker is 1 (out of 1 available) 22690 1727204280.68107: exiting _queue_task() for managed-node2/shell 22690 1727204280.68124: done queuing things up, now waiting for results queue to drain 22690 1727204280.68125: waiting for pending results... 
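What follows repeats, for AnsiballZ_command.py, the same remote execution pipeline the log already showed for AnsiballZ_stat.py: reuse the multiplexed SSH connection, create a per-task temp directory under /root/.ansible/tmp, upload the zipped module over sftp, chmod it, run it with /usr/bin/python3.12, then remove the temp directory. A simplified local re-creation of the /bin/sh steps, with several assumptions: the SSH/ControlPersist transport and the sftp upload are omitted, the mkdir command is shortened, and the commands are only printed because the paths exist on managed-node2, not on the controller.

import shlex

# Temp directory name copied from the log entries below for this task.
tmp = "/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059"
steps = [
    "echo ~ && sleep 0",  # discover the remote home directory
    f'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo {tmp} `" ) && sleep 0',
    f"chmod u+x {tmp}/ {tmp}/AnsiballZ_command.py && sleep 0",
    f"/usr/bin/python3.12 {tmp}/AnsiballZ_command.py && sleep 0",
    f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0",
]
for step in steps:
    print("/bin/sh -c", shlex.quote(step))

The python and rm steps for this particular task are not visible in this excerpt; they are inferred from the identical execution and cleanup shown above for the stat module's temp directory.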
22690 1727204280.68592: running TaskExecutor() for managed-node2/TASK: Get NM profile info 22690 1727204280.68664: in run() - task 127b8e07-fff9-78bb-bf56-000000000505 22690 1727204280.68673: variable 'ansible_search_path' from source: unknown 22690 1727204280.68677: variable 'ansible_search_path' from source: unknown 22690 1727204280.68680: calling self._execute() 22690 1727204280.68761: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.68767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.68872: variable 'omit' from source: magic vars 22690 1727204280.69233: variable 'ansible_distribution_major_version' from source: facts 22690 1727204280.69246: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204280.69253: variable 'omit' from source: magic vars 22690 1727204280.69322: variable 'omit' from source: magic vars 22690 1727204280.69445: variable 'profile' from source: include params 22690 1727204280.69451: variable 'interface' from source: set_fact 22690 1727204280.69873: variable 'interface' from source: set_fact 22690 1727204280.69878: variable 'omit' from source: magic vars 22690 1727204280.69882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204280.69890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204280.69893: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204280.69897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204280.69899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204280.69903: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204280.69906: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.69908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.69911: Set connection var ansible_connection to ssh 22690 1727204280.69913: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204280.69915: Set connection var ansible_pipelining to False 22690 1727204280.69917: Set connection var ansible_shell_type to sh 22690 1727204280.69919: Set connection var ansible_shell_executable to /bin/sh 22690 1727204280.69921: Set connection var ansible_timeout to 10 22690 1727204280.69924: variable 'ansible_shell_executable' from source: unknown 22690 1727204280.69929: variable 'ansible_connection' from source: unknown 22690 1727204280.69932: variable 'ansible_module_compression' from source: unknown 22690 1727204280.69934: variable 'ansible_shell_type' from source: unknown 22690 1727204280.69937: variable 'ansible_shell_executable' from source: unknown 22690 1727204280.69939: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204280.69944: variable 'ansible_pipelining' from source: unknown 22690 1727204280.69947: variable 'ansible_timeout' from source: unknown 22690 1727204280.69952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204280.70116: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204280.70131: variable 'omit' from source: magic vars 22690 1727204280.70137: starting attempt loop 22690 1727204280.70140: running the handler 22690 1727204280.70150: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204280.70168: _low_level_execute_command(): starting 22690 1727204280.70176: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204280.71100: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204280.71112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204280.71130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204280.71148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204280.71161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204280.71214: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.71298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.71347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.71437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.73238: stdout chunk (state=3): >>>/root <<< 22690 1727204280.73476: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.73480: stdout chunk (state=3): >>><<< 22690 1727204280.73483: stderr chunk (state=3): >>><<< 22690 1727204280.73487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.73489: _low_level_execute_command(): starting 22690 1727204280.73492: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059 `" && echo ansible-tmp-1727204280.734518-25856-56788327124059="` echo /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059 `" ) && sleep 0' 22690 1727204280.74190: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204280.74200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.74220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.74323: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.76308: stdout chunk (state=3): >>>ansible-tmp-1727204280.734518-25856-56788327124059=/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059 <<< 22690 1727204280.76421: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.76673: stderr chunk (state=3): >>><<< 22690 1727204280.76676: stdout chunk (state=3): >>><<< 22690 1727204280.76678: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204280.734518-25856-56788327124059=/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.76680: variable 'ansible_module_compression' from source: unknown 22690 1727204280.76682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204280.76683: variable 'ansible_facts' from source: unknown 22690 1727204280.76684: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py 22690 1727204280.76822: Sending initial data 22690 1727204280.76922: Sent initial data (154 bytes) 22690 1727204280.77473: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204280.77571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.77590: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.77683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.79271: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204280.79337: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204280.79404: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpm9xed39v /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py <<< 22690 1727204280.79410: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py" <<< 22690 1727204280.79477: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpm9xed39v" to remote "/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py" <<< 22690 1727204280.79479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py" <<< 22690 1727204280.80198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.80205: stderr chunk (state=3): >>><<< 22690 1727204280.80212: stdout chunk (state=3): >>><<< 22690 1727204280.80232: done transferring module to remote 22690 1727204280.80244: _low_level_execute_command(): starting 22690 1727204280.80249: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/ /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py && sleep 0' 22690 1727204280.80746: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204280.80751: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found <<< 22690 1727204280.80754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22690 1727204280.80757: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204280.80764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.80812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204280.80821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.80890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204280.82734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204280.82805: stderr chunk (state=3): >>><<< 22690 1727204280.82809: stdout chunk (state=3): >>><<< 22690 1727204280.82827: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204280.82831: _low_level_execute_command(): starting 22690 1727204280.82834: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/AnsiballZ_command.py && sleep 0' 22690 1727204280.83582: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204280.83588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204280.83591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204280.83593: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204280.83602: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204280.83669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204281.02299: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:58:01.003928", "end": "2024-09-24 14:58:01.021426", "delta": "0:00:00.017498", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204281.03933: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204281.03937: stdout chunk (state=3): >>><<< 22690 1727204281.03940: stderr chunk (state=3): >>><<< 22690 1727204281.03975: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:58:01.003928", "end": "2024-09-24 14:58:01.021426", "delta": "0:00:00.017498", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.47.73 closed. 
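The JSON result above belongs to the "Get NM profile info" command task that the play reports a few entries later as failed but ignored. From the module arguments it carries (_raw_params and _uses_shell) and from the nm_profile_exists.rc check evaluated in the next task, the originating task in get_profile_stat.yml plausibly has the shape sketched below; the literal lsr27 in the logged command is the already-templated value of the profile variable, and the register/ignore_errors details are inferred from later log entries rather than shown verbatim.

```yaml
# Reconstructed from the module args in the result above; get_profile_stat.yml
# itself is never printed in the log, so treat option names as inferred.
- name: Get NM profile info
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists   # later read as nm_profile_exists.rc
  ignore_errors: true           # the run prints "...ignoring" when rc=1
```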
22690 1727204281.04030: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204281.04044: _low_level_execute_command(): starting 22690 1727204281.04120: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204280.734518-25856-56788327124059/ > /dev/null 2>&1 && sleep 0' 22690 1727204281.05190: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204281.05327: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204281.05582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204281.05654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204281.07690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204281.07705: stdout chunk (state=3): >>><<< 22690 1727204281.07723: stderr chunk (state=3): >>><<< 22690 1727204281.07755: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204281.07771: handler run complete 22690 1727204281.07803: Evaluated conditional (False): False 22690 1727204281.07839: attempt loop complete, returning result 22690 1727204281.07842: _execute() done 22690 1727204281.07845: dumping result to json 22690 1727204281.07928: done dumping result, returning 22690 1727204281.07931: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [127b8e07-fff9-78bb-bf56-000000000505] 22690 1727204281.07934: sending task result for task 127b8e07-fff9-78bb-bf56-000000000505 22690 1727204281.08023: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000505 22690 1727204281.08027: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.017498", "end": "2024-09-24 14:58:01.021426", "rc": 1, "start": "2024-09-24 14:58:01.003928" } MSG: non-zero return code ...ignoring 22690 1727204281.08117: no more pending results, returning what we have 22690 1727204281.08121: results queue empty 22690 1727204281.08122: checking for any_errors_fatal 22690 1727204281.08128: done checking for any_errors_fatal 22690 1727204281.08129: checking for max_fail_percentage 22690 1727204281.08131: done checking for max_fail_percentage 22690 1727204281.08132: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.08133: done checking to see if all hosts have failed 22690 1727204281.08134: getting the remaining hosts for this loop 22690 1727204281.08138: done getting the remaining hosts for this loop 22690 1727204281.08143: getting the next task for host managed-node2 22690 1727204281.08151: done getting next task for host managed-node2 22690 1727204281.08154: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22690 1727204281.08160: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.08164: getting variables 22690 1727204281.08167: in VariableManager get_vars() 22690 1727204281.08200: Calling all_inventory to load vars for managed-node2 22690 1727204281.08203: Calling groups_inventory to load vars for managed-node2 22690 1727204281.08207: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.08221: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.08225: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.08229: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.10501: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.14163: done with get_vars() 22690 1727204281.14209: done getting variables 22690 1727204281.14403: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.467) 0:00:48.427 ***** 22690 1727204281.14440: entering _queue_task() for managed-node2/set_fact 22690 1727204281.15303: worker is 1 (out of 1 available) 22690 1727204281.15318: exiting _queue_task() for managed-node2/set_fact 22690 1727204281.15334: done queuing things up, now waiting for results queue to drain 22690 1727204281.15335: waiting for pending results... 
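The task being queued here is the set_fact at get_profile_stat.yml:35; the entries that follow show it skipped because nm_profile_exists.rc == 0 evaluates False after the failed nmcli lookup. A plausible shape is sketched below; only the task name, the set_fact action, and the rc guard are confirmed by the log, while the fact names are assumptions based on the lsr_net_profile_exists variable the later assert reads.

```yaml
# Name, action, and when: guard match the log; the facts being set are assumed.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true   # assumed companion flag, per the task name
  when: nm_profile_exists.rc == 0
```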
22690 1727204281.15797: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22690 1727204281.15873: in run() - task 127b8e07-fff9-78bb-bf56-000000000506 22690 1727204281.15882: variable 'ansible_search_path' from source: unknown 22690 1727204281.15886: variable 'ansible_search_path' from source: unknown 22690 1727204281.15896: calling self._execute() 22690 1727204281.15994: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.16001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.16008: variable 'omit' from source: magic vars 22690 1727204281.16443: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.16458: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.16606: variable 'nm_profile_exists' from source: set_fact 22690 1727204281.16628: Evaluated conditional (nm_profile_exists.rc == 0): False 22690 1727204281.16632: when evaluation is False, skipping this task 22690 1727204281.16635: _execute() done 22690 1727204281.16638: dumping result to json 22690 1727204281.16645: done dumping result, returning 22690 1727204281.16649: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [127b8e07-fff9-78bb-bf56-000000000506] 22690 1727204281.16670: sending task result for task 127b8e07-fff9-78bb-bf56-000000000506 skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 22690 1727204281.16924: no more pending results, returning what we have 22690 1727204281.16928: results queue empty 22690 1727204281.16929: checking for any_errors_fatal 22690 1727204281.16937: done checking for any_errors_fatal 22690 1727204281.16938: checking for max_fail_percentage 22690 1727204281.16940: done checking for max_fail_percentage 22690 1727204281.16940: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.16941: done checking to see if all hosts have failed 22690 1727204281.16942: getting the remaining hosts for this loop 22690 1727204281.16943: done getting the remaining hosts for this loop 22690 1727204281.16946: getting the next task for host managed-node2 22690 1727204281.16955: done getting next task for host managed-node2 22690 1727204281.16958: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 22690 1727204281.16962: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.16970: getting variables 22690 1727204281.16972: in VariableManager get_vars() 22690 1727204281.17001: Calling all_inventory to load vars for managed-node2 22690 1727204281.17003: Calling groups_inventory to load vars for managed-node2 22690 1727204281.17007: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.17018: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.17020: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.17023: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.17554: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000506 22690 1727204281.17558: WORKER PROCESS EXITING 22690 1727204281.19011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.21274: done with get_vars() 22690 1727204281.21314: done getting variables 22690 1727204281.21394: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204281.21543: variable 'profile' from source: include params 22690 1727204281.21548: variable 'interface' from source: set_fact 22690 1727204281.21624: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.072) 0:00:48.500 ***** 22690 1727204281.21670: entering _queue_task() for managed-node2/command 22690 1727204281.22300: worker is 1 (out of 1 available) 22690 1727204281.22313: exiting _queue_task() for managed-node2/command 22690 1727204281.22326: done queuing things up, now waiting for results queue to drain 22690 1727204281.22327: waiting for pending results... 
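The command task queued here (get_profile_stat.yml:49) is skipped below because profile_stat.stat.exists is False, so its body never reaches the log; only the templated name "Get the ansible_managed comment in ifcfg-{{ profile }}" and the guard are confirmed. The sketch below shows what such a check typically looks like, with the grep pattern, file path, and register name as illustrative assumptions.

```yaml
# Illustrative only: the task is skipped in this run, so its actual command is
# never logged. The name and the when: guard match the log; the rest is assumed.
- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ansible_managed_comment   # hypothetical register name
  when: profile_stat.stat.exists
```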
22690 1727204281.22427: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 22690 1727204281.22578: in run() - task 127b8e07-fff9-78bb-bf56-000000000508 22690 1727204281.22604: variable 'ansible_search_path' from source: unknown 22690 1727204281.22613: variable 'ansible_search_path' from source: unknown 22690 1727204281.22670: calling self._execute() 22690 1727204281.22799: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.22886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.22891: variable 'omit' from source: magic vars 22690 1727204281.23283: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.23304: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.23450: variable 'profile_stat' from source: set_fact 22690 1727204281.23477: Evaluated conditional (profile_stat.stat.exists): False 22690 1727204281.23485: when evaluation is False, skipping this task 22690 1727204281.23493: _execute() done 22690 1727204281.23501: dumping result to json 22690 1727204281.23508: done dumping result, returning 22690 1727204281.23519: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 [127b8e07-fff9-78bb-bf56-000000000508] 22690 1727204281.23529: sending task result for task 127b8e07-fff9-78bb-bf56-000000000508 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22690 1727204281.23830: no more pending results, returning what we have 22690 1727204281.23834: results queue empty 22690 1727204281.23836: checking for any_errors_fatal 22690 1727204281.23844: done checking for any_errors_fatal 22690 1727204281.23845: checking for max_fail_percentage 22690 1727204281.23847: done checking for max_fail_percentage 22690 1727204281.23848: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.23849: done checking to see if all hosts have failed 22690 1727204281.23850: getting the remaining hosts for this loop 22690 1727204281.23851: done getting the remaining hosts for this loop 22690 1727204281.23857: getting the next task for host managed-node2 22690 1727204281.23872: done getting next task for host managed-node2 22690 1727204281.23876: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 22690 1727204281.23881: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.23886: getting variables 22690 1727204281.23888: in VariableManager get_vars() 22690 1727204281.23923: Calling all_inventory to load vars for managed-node2 22690 1727204281.23926: Calling groups_inventory to load vars for managed-node2 22690 1727204281.23930: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.24086: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.24090: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.24095: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.24705: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000508 22690 1727204281.24710: WORKER PROCESS EXITING 22690 1727204281.26122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.28397: done with get_vars() 22690 1727204281.28442: done getting variables 22690 1727204281.28519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204281.28655: variable 'profile' from source: include params 22690 1727204281.28660: variable 'interface' from source: set_fact 22690 1727204281.28732: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.070) 0:00:48.571 ***** 22690 1727204281.28770: entering _queue_task() for managed-node2/set_fact 22690 1727204281.29384: worker is 1 (out of 1 available) 22690 1727204281.29395: exiting _queue_task() for managed-node2/set_fact 22690 1727204281.29409: done queuing things up, now waiting for results queue to drain 22690 1727204281.29410: waiting for pending results... 
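Every ifcfg check in this block skips on the same profile_stat.stat.exists guard, which implies an earlier stat of the profile's ifcfg file registered as profile_stat; that task ran before the portion of the log shown here. A minimal sketch of that prerequisite, assuming the conventional ifcfg path:

```yaml
# Assumed shape of the earlier task that registered profile_stat; it falls
# outside this portion of the log, so the task name and path are illustrative.
- name: Get the stat of the ifcfg file for {{ profile }}
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat
```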
22690 1727204281.29549: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 22690 1727204281.29709: in run() - task 127b8e07-fff9-78bb-bf56-000000000509 22690 1727204281.29736: variable 'ansible_search_path' from source: unknown 22690 1727204281.29751: variable 'ansible_search_path' from source: unknown 22690 1727204281.29799: calling self._execute() 22690 1727204281.29920: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.29934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.29950: variable 'omit' from source: magic vars 22690 1727204281.30390: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.30418: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.30563: variable 'profile_stat' from source: set_fact 22690 1727204281.30588: Evaluated conditional (profile_stat.stat.exists): False 22690 1727204281.30596: when evaluation is False, skipping this task 22690 1727204281.30604: _execute() done 22690 1727204281.30611: dumping result to json 22690 1727204281.30626: done dumping result, returning 22690 1727204281.30638: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [127b8e07-fff9-78bb-bf56-000000000509] 22690 1727204281.30648: sending task result for task 127b8e07-fff9-78bb-bf56-000000000509 22690 1727204281.30812: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000509 22690 1727204281.30816: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22690 1727204281.30888: no more pending results, returning what we have 22690 1727204281.30894: results queue empty 22690 1727204281.30896: checking for any_errors_fatal 22690 1727204281.30904: done checking for any_errors_fatal 22690 1727204281.30905: checking for max_fail_percentage 22690 1727204281.30907: done checking for max_fail_percentage 22690 1727204281.30908: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.30909: done checking to see if all hosts have failed 22690 1727204281.30910: getting the remaining hosts for this loop 22690 1727204281.30911: done getting the remaining hosts for this loop 22690 1727204281.30916: getting the next task for host managed-node2 22690 1727204281.30925: done getting next task for host managed-node2 22690 1727204281.30929: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 22690 1727204281.30935: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.30939: getting variables 22690 1727204281.30942: in VariableManager get_vars() 22690 1727204281.30978: Calling all_inventory to load vars for managed-node2 22690 1727204281.30981: Calling groups_inventory to load vars for managed-node2 22690 1727204281.30986: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.31003: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.31006: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.31009: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.33249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.35544: done with get_vars() 22690 1727204281.35585: done getting variables 22690 1727204281.35660: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204281.35792: variable 'profile' from source: include params 22690 1727204281.35796: variable 'interface' from source: set_fact 22690 1727204281.35864: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.071) 0:00:48.642 ***** 22690 1727204281.35900: entering _queue_task() for managed-node2/command 22690 1727204281.36484: worker is 1 (out of 1 available) 22690 1727204281.36495: exiting _queue_task() for managed-node2/command 22690 1727204281.36509: done queuing things up, now waiting for results queue to drain 22690 1727204281.36511: waiting for pending results... 
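Throughout these tasks the executor resolves profile "from source: include params" and interface "from source: set_fact" before rendering the task names, i.e. get_profile_stat.yml is included with the profile passed as a parameter and the interface name (lsr27) was stored earlier with set_fact. The including task is not printed in this part of the log, so the sketch below shows the pattern rather than the literal YAML.

```yaml
# Inferred from "variable 'profile' from source: include params" and
# "variable 'interface' from source: set_fact"; the wording of the including
# task itself is an assumption.
- name: Include the task 'get_profile_stat.yml'
  ansible.builtin.include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ interface }}"   # interface was set to lsr27 via set_fact earlier
```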
22690 1727204281.36753: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr27 22690 1727204281.36890: in run() - task 127b8e07-fff9-78bb-bf56-00000000050a 22690 1727204281.36899: variable 'ansible_search_path' from source: unknown 22690 1727204281.36908: variable 'ansible_search_path' from source: unknown 22690 1727204281.36960: calling self._execute() 22690 1727204281.37080: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.37093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.37111: variable 'omit' from source: magic vars 22690 1727204281.37538: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.37557: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.37701: variable 'profile_stat' from source: set_fact 22690 1727204281.37830: Evaluated conditional (profile_stat.stat.exists): False 22690 1727204281.37833: when evaluation is False, skipping this task 22690 1727204281.37837: _execute() done 22690 1727204281.37840: dumping result to json 22690 1727204281.37842: done dumping result, returning 22690 1727204281.37845: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr27 [127b8e07-fff9-78bb-bf56-00000000050a] 22690 1727204281.37847: sending task result for task 127b8e07-fff9-78bb-bf56-00000000050a 22690 1727204281.37967: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000050a skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22690 1727204281.38030: no more pending results, returning what we have 22690 1727204281.38035: results queue empty 22690 1727204281.38036: checking for any_errors_fatal 22690 1727204281.38049: done checking for any_errors_fatal 22690 1727204281.38051: checking for max_fail_percentage 22690 1727204281.38053: done checking for max_fail_percentage 22690 1727204281.38054: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.38055: done checking to see if all hosts have failed 22690 1727204281.38056: getting the remaining hosts for this loop 22690 1727204281.38058: done getting the remaining hosts for this loop 22690 1727204281.38063: getting the next task for host managed-node2 22690 1727204281.38073: done getting next task for host managed-node2 22690 1727204281.38076: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 22690 1727204281.38081: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.38086: getting variables 22690 1727204281.38088: in VariableManager get_vars() 22690 1727204281.38125: Calling all_inventory to load vars for managed-node2 22690 1727204281.38128: Calling groups_inventory to load vars for managed-node2 22690 1727204281.38132: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.38383: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.38387: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.38391: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.38995: WORKER PROCESS EXITING 22690 1727204281.40337: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.42783: done with get_vars() 22690 1727204281.42832: done getting variables 22690 1727204281.43020: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204281.43410: variable 'profile' from source: include params 22690 1727204281.43415: variable 'interface' from source: set_fact 22690 1727204281.43628: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.077) 0:00:48.720 ***** 22690 1727204281.43670: entering _queue_task() for managed-node2/set_fact 22690 1727204281.44912: worker is 1 (out of 1 available) 22690 1727204281.44928: exiting _queue_task() for managed-node2/set_fact 22690 1727204281.45082: done queuing things up, now waiting for results queue to drain 22690 1727204281.45084: waiting for pending results... 
22690 1727204281.45445: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 22690 1727204281.45634: in run() - task 127b8e07-fff9-78bb-bf56-00000000050b 22690 1727204281.45669: variable 'ansible_search_path' from source: unknown 22690 1727204281.45677: variable 'ansible_search_path' from source: unknown 22690 1727204281.45729: calling self._execute() 22690 1727204281.45848: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.45864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.45884: variable 'omit' from source: magic vars 22690 1727204281.46339: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.46367: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.46538: variable 'profile_stat' from source: set_fact 22690 1727204281.46564: Evaluated conditional (profile_stat.stat.exists): False 22690 1727204281.46617: when evaluation is False, skipping this task 22690 1727204281.46622: _execute() done 22690 1727204281.46625: dumping result to json 22690 1727204281.46628: done dumping result, returning 22690 1727204281.46640: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 [127b8e07-fff9-78bb-bf56-00000000050b] 22690 1727204281.46683: sending task result for task 127b8e07-fff9-78bb-bf56-00000000050b 22690 1727204281.46942: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000050b 22690 1727204281.46947: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22690 1727204281.47006: no more pending results, returning what we have 22690 1727204281.47011: results queue empty 22690 1727204281.47012: checking for any_errors_fatal 22690 1727204281.47018: done checking for any_errors_fatal 22690 1727204281.47019: checking for max_fail_percentage 22690 1727204281.47020: done checking for max_fail_percentage 22690 1727204281.47022: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.47023: done checking to see if all hosts have failed 22690 1727204281.47023: getting the remaining hosts for this loop 22690 1727204281.47025: done getting the remaining hosts for this loop 22690 1727204281.47030: getting the next task for host managed-node2 22690 1727204281.47041: done getting next task for host managed-node2 22690 1727204281.47075: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 22690 1727204281.47080: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.47086: getting variables 22690 1727204281.47088: in VariableManager get_vars() 22690 1727204281.47184: Calling all_inventory to load vars for managed-node2 22690 1727204281.47187: Calling groups_inventory to load vars for managed-node2 22690 1727204281.47192: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.47208: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.47212: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.47215: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.49637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.53400: done with get_vars() 22690 1727204281.53442: done getting variables 22690 1727204281.53668: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204281.53851: variable 'profile' from source: include params 22690 1727204281.53855: variable 'interface' from source: set_fact 22690 1727204281.53928: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.102) 0:00:48.823 ***** 22690 1727204281.53962: entering _queue_task() for managed-node2/assert 22690 1727204281.54786: worker is 1 (out of 1 available) 22690 1727204281.54801: exiting _queue_task() for managed-node2/assert 22690 1727204281.54815: done queuing things up, now waiting for results queue to drain 22690 1727204281.54817: waiting for pending results... 
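The entries that follow evaluate a single condition, not lsr_net_profile_exists, and report "All assertions passed", so the assert at assert_profile_absent.yml:5 plausibly reads as below; the failure message is an assumption, since the assertion never fails in this run.

```yaml
# The evaluated condition (not lsr_net_profile_exists) and the templated task
# name come from the log; the fail_msg wording is illustrative.
- name: Assert that the profile is absent - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile '{{ profile }}' still exists"
```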
22690 1727204281.55175: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'lsr27' 22690 1727204281.55500: in run() - task 127b8e07-fff9-78bb-bf56-0000000004f6 22690 1727204281.55505: variable 'ansible_search_path' from source: unknown 22690 1727204281.55508: variable 'ansible_search_path' from source: unknown 22690 1727204281.55513: calling self._execute() 22690 1727204281.55631: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.55646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.55663: variable 'omit' from source: magic vars 22690 1727204281.56237: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.56242: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.56245: variable 'omit' from source: magic vars 22690 1727204281.56343: variable 'omit' from source: magic vars 22690 1727204281.56472: variable 'profile' from source: include params 22690 1727204281.56484: variable 'interface' from source: set_fact 22690 1727204281.56569: variable 'interface' from source: set_fact 22690 1727204281.56599: variable 'omit' from source: magic vars 22690 1727204281.56717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204281.56765: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204281.56842: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204281.56870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204281.56891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204281.57159: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204281.57162: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.57167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.57271: Set connection var ansible_connection to ssh 22690 1727204281.57290: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204281.57304: Set connection var ansible_pipelining to False 22690 1727204281.57384: Set connection var ansible_shell_type to sh 22690 1727204281.57396: Set connection var ansible_shell_executable to /bin/sh 22690 1727204281.57410: Set connection var ansible_timeout to 10 22690 1727204281.57443: variable 'ansible_shell_executable' from source: unknown 22690 1727204281.57451: variable 'ansible_connection' from source: unknown 22690 1727204281.57458: variable 'ansible_module_compression' from source: unknown 22690 1727204281.57467: variable 'ansible_shell_type' from source: unknown 22690 1727204281.57487: variable 'ansible_shell_executable' from source: unknown 22690 1727204281.57490: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.57570: variable 'ansible_pipelining' from source: unknown 22690 1727204281.57574: variable 'ansible_timeout' from source: unknown 22690 1727204281.57577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.57687: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204281.57714: variable 'omit' from source: magic vars 22690 1727204281.57725: starting attempt loop 22690 1727204281.57733: running the handler 22690 1727204281.57913: variable 'lsr_net_profile_exists' from source: set_fact 22690 1727204281.58083: Evaluated conditional (not lsr_net_profile_exists): True 22690 1727204281.58095: handler run complete 22690 1727204281.58117: attempt loop complete, returning result 22690 1727204281.58125: _execute() done 22690 1727204281.58133: dumping result to json 22690 1727204281.58141: done dumping result, returning 22690 1727204281.58152: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'lsr27' [127b8e07-fff9-78bb-bf56-0000000004f6] 22690 1727204281.58186: sending task result for task 127b8e07-fff9-78bb-bf56-0000000004f6 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22690 1727204281.58341: no more pending results, returning what we have 22690 1727204281.58346: results queue empty 22690 1727204281.58347: checking for any_errors_fatal 22690 1727204281.58357: done checking for any_errors_fatal 22690 1727204281.58358: checking for max_fail_percentage 22690 1727204281.58360: done checking for max_fail_percentage 22690 1727204281.58361: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.58362: done checking to see if all hosts have failed 22690 1727204281.58363: getting the remaining hosts for this loop 22690 1727204281.58367: done getting the remaining hosts for this loop 22690 1727204281.58373: getting the next task for host managed-node2 22690 1727204281.58381: done getting next task for host managed-node2 22690 1727204281.58385: ^ task is: TASK: Include the task 'assert_device_absent.yml' 22690 1727204281.58388: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.58393: getting variables 22690 1727204281.58395: in VariableManager get_vars() 22690 1727204281.58431: Calling all_inventory to load vars for managed-node2 22690 1727204281.58434: Calling groups_inventory to load vars for managed-node2 22690 1727204281.58438: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.58453: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.58456: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.58460: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.59746: done sending task result for task 127b8e07-fff9-78bb-bf56-0000000004f6 22690 1727204281.59750: WORKER PROCESS EXITING 22690 1727204281.62546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.67314: done with get_vars() 22690 1727204281.67355: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.136) 0:00:48.959 ***** 22690 1727204281.67571: entering _queue_task() for managed-node2/include_tasks 22690 1727204281.68587: worker is 1 (out of 1 available) 22690 1727204281.68601: exiting _queue_task() for managed-node2/include_tasks 22690 1727204281.68616: done queuing things up, now waiting for results queue to drain 22690 1727204281.68617: waiting for pending results... 22690 1727204281.69167: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 22690 1727204281.69377: in run() - task 127b8e07-fff9-78bb-bf56-000000000075 22690 1727204281.69396: variable 'ansible_search_path' from source: unknown 22690 1727204281.69444: calling self._execute() 22690 1727204281.69557: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.69566: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.69788: variable 'omit' from source: magic vars 22690 1727204281.70739: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.70763: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.70769: _execute() done 22690 1727204281.70772: dumping result to json 22690 1727204281.70775: done dumping result, returning 22690 1727204281.70779: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [127b8e07-fff9-78bb-bf56-000000000075] 22690 1727204281.70781: sending task result for task 127b8e07-fff9-78bb-bf56-000000000075 22690 1727204281.70963: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000075 22690 1727204281.70970: WORKER PROCESS EXITING 22690 1727204281.71099: no more pending results, returning what we have 22690 1727204281.71105: in VariableManager get_vars() 22690 1727204281.71146: Calling all_inventory to load vars for managed-node2 22690 1727204281.71149: Calling groups_inventory to load vars for managed-node2 22690 1727204281.71152: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.71170: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.71173: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.71176: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.75242: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.80056: done with get_vars() 22690 1727204281.80097: variable 'ansible_search_path' from source: unknown 22690 1727204281.80115: we have included files to process 22690 1727204281.80117: generating all_blocks data 22690 1727204281.80119: done generating all_blocks data 22690 1727204281.80124: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 22690 1727204281.80125: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 22690 1727204281.80129: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 22690 1727204281.80521: in VariableManager get_vars() 22690 1727204281.80542: done with get_vars() 22690 1727204281.80876: done processing included file 22690 1727204281.80879: iterating over new_blocks loaded from include file 22690 1727204281.80881: in VariableManager get_vars() 22690 1727204281.80895: done with get_vars() 22690 1727204281.80898: filtering new block on tags 22690 1727204281.80919: done filtering new block on tags 22690 1727204281.80922: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 22690 1727204281.80928: extending task lists for all hosts with included blocks 22690 1727204281.81294: done extending task lists 22690 1727204281.81295: done processing included files 22690 1727204281.81296: results queue empty 22690 1727204281.81297: checking for any_errors_fatal 22690 1727204281.81301: done checking for any_errors_fatal 22690 1727204281.81302: checking for max_fail_percentage 22690 1727204281.81303: done checking for max_fail_percentage 22690 1727204281.81304: checking to see if all hosts have failed and the running result is not ok 22690 1727204281.81305: done checking to see if all hosts have failed 22690 1727204281.81306: getting the remaining hosts for this loop 22690 1727204281.81307: done getting the remaining hosts for this loop 22690 1727204281.81309: getting the next task for host managed-node2 22690 1727204281.81313: done getting next task for host managed-node2 22690 1727204281.81315: ^ task is: TASK: Include the task 'get_interface_stat.yml' 22690 1727204281.81318: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204281.81320: getting variables 22690 1727204281.81321: in VariableManager get_vars() 22690 1727204281.81331: Calling all_inventory to load vars for managed-node2 22690 1727204281.81334: Calling groups_inventory to load vars for managed-node2 22690 1727204281.81336: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.81342: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.81345: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.81348: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.84807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204281.89671: done with get_vars() 22690 1727204281.89701: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.222) 0:00:49.181 ***** 22690 1727204281.89794: entering _queue_task() for managed-node2/include_tasks 22690 1727204281.90602: worker is 1 (out of 1 available) 22690 1727204281.90619: exiting _queue_task() for managed-node2/include_tasks 22690 1727204281.90634: done queuing things up, now waiting for results queue to drain 22690 1727204281.90635: waiting for pending results... 22690 1727204281.91290: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 22690 1727204281.91443: in run() - task 127b8e07-fff9-78bb-bf56-00000000053c 22690 1727204281.91460: variable 'ansible_search_path' from source: unknown 22690 1727204281.91464: variable 'ansible_search_path' from source: unknown 22690 1727204281.91625: calling self._execute() 22690 1727204281.91885: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204281.91891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204281.91905: variable 'omit' from source: magic vars 22690 1727204281.92826: variable 'ansible_distribution_major_version' from source: facts 22690 1727204281.92838: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204281.92846: _execute() done 22690 1727204281.92849: dumping result to json 22690 1727204281.92854: done dumping result, returning 22690 1727204281.92862: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [127b8e07-fff9-78bb-bf56-00000000053c] 22690 1727204281.92912: sending task result for task 127b8e07-fff9-78bb-bf56-00000000053c 22690 1727204281.93198: no more pending results, returning what we have 22690 1727204281.93205: in VariableManager get_vars() 22690 1727204281.93244: Calling all_inventory to load vars for managed-node2 22690 1727204281.93247: Calling groups_inventory to load vars for managed-node2 22690 1727204281.93252: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204281.93271: Calling all_plugins_play to load vars for managed-node2 22690 1727204281.93275: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204281.93279: Calling groups_plugins_play to load vars for managed-node2 22690 1727204281.93885: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000053c 22690 1727204281.93889: WORKER PROCESS EXITING 22690 1727204281.96852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 22690 1727204282.01277: done with get_vars() 22690 1727204282.01313: variable 'ansible_search_path' from source: unknown 22690 1727204282.01315: variable 'ansible_search_path' from source: unknown 22690 1727204282.01360: we have included files to process 22690 1727204282.01361: generating all_blocks data 22690 1727204282.01363: done generating all_blocks data 22690 1727204282.01364: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22690 1727204282.01569: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22690 1727204282.01574: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22690 1727204282.01985: done processing included file 22690 1727204282.01988: iterating over new_blocks loaded from include file 22690 1727204282.01989: in VariableManager get_vars() 22690 1727204282.02006: done with get_vars() 22690 1727204282.02008: filtering new block on tags 22690 1727204282.02025: done filtering new block on tags 22690 1727204282.02028: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 22690 1727204282.02033: extending task lists for all hosts with included blocks 22690 1727204282.02144: done extending task lists 22690 1727204282.02145: done processing included files 22690 1727204282.02146: results queue empty 22690 1727204282.02147: checking for any_errors_fatal 22690 1727204282.02151: done checking for any_errors_fatal 22690 1727204282.02152: checking for max_fail_percentage 22690 1727204282.02153: done checking for max_fail_percentage 22690 1727204282.02153: checking to see if all hosts have failed and the running result is not ok 22690 1727204282.02154: done checking to see if all hosts have failed 22690 1727204282.02155: getting the remaining hosts for this loop 22690 1727204282.02156: done getting the remaining hosts for this loop 22690 1727204282.02158: getting the next task for host managed-node2 22690 1727204282.02162: done getting next task for host managed-node2 22690 1727204282.02165: ^ task is: TASK: Get stat for interface {{ interface }} 22690 1727204282.02372: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204282.02375: getting variables 22690 1727204282.02376: in VariableManager get_vars() 22690 1727204282.02388: Calling all_inventory to load vars for managed-node2 22690 1727204282.02390: Calling groups_inventory to load vars for managed-node2 22690 1727204282.02393: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.02400: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.02403: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.02406: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.04955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.07150: done with get_vars() 22690 1727204282.07192: done getting variables 22690 1727204282.07383: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.176) 0:00:49.357 ***** 22690 1727204282.07418: entering _queue_task() for managed-node2/stat 22690 1727204282.08055: worker is 1 (out of 1 available) 22690 1727204282.08180: exiting _queue_task() for managed-node2/stat 22690 1727204282.08199: done queuing things up, now waiting for results queue to drain 22690 1727204282.08201: waiting for pending results... 22690 1727204282.08554: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 22690 1727204282.08764: in run() - task 127b8e07-fff9-78bb-bf56-000000000554 22690 1727204282.08794: variable 'ansible_search_path' from source: unknown 22690 1727204282.08807: variable 'ansible_search_path' from source: unknown 22690 1727204282.09068: calling self._execute() 22690 1727204282.09074: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.09077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.09082: variable 'omit' from source: magic vars 22690 1727204282.09415: variable 'ansible_distribution_major_version' from source: facts 22690 1727204282.09434: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204282.09446: variable 'omit' from source: magic vars 22690 1727204282.09504: variable 'omit' from source: magic vars 22690 1727204282.09619: variable 'interface' from source: set_fact 22690 1727204282.09647: variable 'omit' from source: magic vars 22690 1727204282.09699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204282.09748: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204282.09776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204282.09800: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204282.09818: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204282.09863: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204282.09874: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.09883: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 22690 1727204282.09992: Set connection var ansible_connection to ssh 22690 1727204282.10011: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204282.10026: Set connection var ansible_pipelining to False 22690 1727204282.10033: Set connection var ansible_shell_type to sh 22690 1727204282.10044: Set connection var ansible_shell_executable to /bin/sh 22690 1727204282.10055: Set connection var ansible_timeout to 10 22690 1727204282.10091: variable 'ansible_shell_executable' from source: unknown 22690 1727204282.10099: variable 'ansible_connection' from source: unknown 22690 1727204282.10106: variable 'ansible_module_compression' from source: unknown 22690 1727204282.10113: variable 'ansible_shell_type' from source: unknown 22690 1727204282.10120: variable 'ansible_shell_executable' from source: unknown 22690 1727204282.10172: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.10175: variable 'ansible_pipelining' from source: unknown 22690 1727204282.10178: variable 'ansible_timeout' from source: unknown 22690 1727204282.10180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.10373: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 22690 1727204282.10394: variable 'omit' from source: magic vars 22690 1727204282.10407: starting attempt loop 22690 1727204282.10436: running the handler 22690 1727204282.10456: _low_level_execute_command(): starting 22690 1727204282.10472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204282.11891: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204282.11898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204282.12050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204282.12079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204282.12190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204282.13958: stdout chunk (state=3): >>>/root <<< 22690 1727204282.14171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204282.14175: stdout chunk (state=3): >>><<< 22690 1727204282.14177: stderr chunk (state=3): >>><<< 22690 1727204282.14207: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204282.14231: _low_level_execute_command(): starting 22690 1727204282.14243: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080 `" && echo ansible-tmp-1727204282.142162-25984-124744578300080="` echo /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080 `" ) && sleep 0' 22690 1727204282.15752: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204282.15922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204282.15926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204282.15977: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204282.16045: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204282.18058: stdout chunk (state=3): >>>ansible-tmp-1727204282.142162-25984-124744578300080=/root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080 <<< 22690 1727204282.18343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204282.18408: stderr chunk (state=3): >>><<< 22690 1727204282.18412: stdout chunk (state=3): >>><<< 22690 1727204282.18415: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204282.142162-25984-124744578300080=/root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204282.18572: variable 'ansible_module_compression' from source: unknown 22690 1727204282.18700: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22690 1727204282.18806: variable 'ansible_facts' from source: unknown 22690 1727204282.18973: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py 22690 1727204282.19519: Sending initial data 22690 1727204282.19583: Sent initial data (152 bytes) 22690 1727204282.20870: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204282.21004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204282.21036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204282.21199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204282.22916: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 
debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204282.22982: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204282.23028: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp82jpm1nh /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py <<< 22690 1727204282.23034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py" <<< 22690 1727204282.23230: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmp82jpm1nh" to remote "/root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py" <<< 22690 1727204282.24573: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204282.24690: stderr chunk (state=3): >>><<< 22690 1727204282.24972: stdout chunk (state=3): >>><<< 22690 1727204282.24976: done transferring module to remote 22690 1727204282.24979: _low_level_execute_command(): starting 22690 1727204282.24981: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/ /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py && sleep 0' 22690 1727204282.26192: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204282.26288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204282.26483: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204282.26582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204282.28504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204282.28567: stderr chunk (state=3): >>><<< 22690 1727204282.28741: stdout chunk (state=3): >>><<< 22690 1727204282.28745: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204282.28754: _low_level_execute_command(): starting 22690 1727204282.28756: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/AnsiballZ_stat.py && sleep 0' 22690 1727204282.30018: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204282.30025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204282.30040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204282.30072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204282.30077: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204282.30080: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204282.30083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204282.30094: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204282.30141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204282.30146: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204282.30149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204282.30151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204282.30154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204282.30488: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204282.30492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204282.30620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204282.47292: stdout chunk (state=3): >>> {"changed": 
false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22690 1727204282.48570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204282.48580: stderr chunk (state=3): >>>Shared connection to 10.31.47.73 closed. <<< 22690 1727204282.48685: stderr chunk (state=3): >>><<< 22690 1727204282.48921: stdout chunk (state=3): >>><<< 22690 1727204282.48929: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
22690 1727204282.48935: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204282.48947: _low_level_execute_command(): starting 22690 1727204282.48953: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204282.142162-25984-124744578300080/ > /dev/null 2>&1 && sleep 0' 22690 1727204282.50486: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204282.50491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204282.50502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204282.50509: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204282.50630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204282.50649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204282.50682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204282.50861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204282.52898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204282.53166: stderr chunk (state=3): >>><<< 22690 1727204282.53172: stdout chunk (state=3): >>><<< 22690 1727204282.53175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204282.53182: handler run complete 22690 1727204282.53185: attempt loop complete, returning result 22690 1727204282.53188: _execute() done 22690 1727204282.53200: dumping result to json 22690 1727204282.53275: done dumping result, returning 22690 1727204282.53279: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 [127b8e07-fff9-78bb-bf56-000000000554] 22690 1727204282.53282: sending task result for task 127b8e07-fff9-78bb-bf56-000000000554 22690 1727204282.53545: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000554 22690 1727204282.53548: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 22690 1727204282.53639: no more pending results, returning what we have 22690 1727204282.53644: results queue empty 22690 1727204282.53645: checking for any_errors_fatal 22690 1727204282.53647: done checking for any_errors_fatal 22690 1727204282.53648: checking for max_fail_percentage 22690 1727204282.53650: done checking for max_fail_percentage 22690 1727204282.53651: checking to see if all hosts have failed and the running result is not ok 22690 1727204282.53652: done checking to see if all hosts have failed 22690 1727204282.53653: getting the remaining hosts for this loop 22690 1727204282.53654: done getting the remaining hosts for this loop 22690 1727204282.53972: getting the next task for host managed-node2 22690 1727204282.53981: done getting next task for host managed-node2 22690 1727204282.53984: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 22690 1727204282.53988: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204282.53993: getting variables 22690 1727204282.53995: in VariableManager get_vars() 22690 1727204282.54032: Calling all_inventory to load vars for managed-node2 22690 1727204282.54035: Calling groups_inventory to load vars for managed-node2 22690 1727204282.54039: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.54052: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.54055: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.54058: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.57248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.60040: done with get_vars() 22690 1727204282.60074: done getting variables 22690 1727204282.60181: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 22690 1727204282.60370: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.529) 0:00:49.887 ***** 22690 1727204282.60406: entering _queue_task() for managed-node2/assert 22690 1727204282.61018: worker is 1 (out of 1 available) 22690 1727204282.61032: exiting _queue_task() for managed-node2/assert 22690 1727204282.61046: done queuing things up, now waiting for results queue to drain 22690 1727204282.61047: waiting for pending results... 
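Before the assert runs, it may help to sketch what tasks/assert_device_absent.yml presumably contains, based only on the task names, file paths and the conditional evaluated in the log below; the real file in fedora.linux_system_roles may differ in detail:

    # Hedged sketch of tasks/assert_device_absent.yml (lines 3 and 5 per the
    # task paths logged here); contents inferred from the log, not copied
    # from the file.
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml

    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists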
22690 1727204282.61410: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'lsr27' 22690 1727204282.61418: in run() - task 127b8e07-fff9-78bb-bf56-00000000053d 22690 1727204282.61422: variable 'ansible_search_path' from source: unknown 22690 1727204282.61426: variable 'ansible_search_path' from source: unknown 22690 1727204282.61430: calling self._execute() 22690 1727204282.61610: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.61617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.61621: variable 'omit' from source: magic vars 22690 1727204282.62004: variable 'ansible_distribution_major_version' from source: facts 22690 1727204282.62042: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204282.62046: variable 'omit' from source: magic vars 22690 1727204282.62074: variable 'omit' from source: magic vars 22690 1727204282.62192: variable 'interface' from source: set_fact 22690 1727204282.62266: variable 'omit' from source: magic vars 22690 1727204282.62270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204282.62310: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204282.62336: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204282.62356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204282.62376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204282.62409: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204282.62416: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.62420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.62570: Set connection var ansible_connection to ssh 22690 1727204282.62573: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204282.62579: Set connection var ansible_pipelining to False 22690 1727204282.62583: Set connection var ansible_shell_type to sh 22690 1727204282.62586: Set connection var ansible_shell_executable to /bin/sh 22690 1727204282.62589: Set connection var ansible_timeout to 10 22690 1727204282.62604: variable 'ansible_shell_executable' from source: unknown 22690 1727204282.62607: variable 'ansible_connection' from source: unknown 22690 1727204282.62610: variable 'ansible_module_compression' from source: unknown 22690 1727204282.62612: variable 'ansible_shell_type' from source: unknown 22690 1727204282.62617: variable 'ansible_shell_executable' from source: unknown 22690 1727204282.62620: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.62622: variable 'ansible_pipelining' from source: unknown 22690 1727204282.62624: variable 'ansible_timeout' from source: unknown 22690 1727204282.62672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.62802: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 22690 1727204282.62818: variable 'omit' from source: magic vars 22690 1727204282.62821: starting attempt loop 22690 1727204282.62824: running the handler 22690 1727204282.63021: variable 'interface_stat' from source: set_fact 22690 1727204282.63026: Evaluated conditional (not interface_stat.stat.exists): True 22690 1727204282.63029: handler run complete 22690 1727204282.63072: attempt loop complete, returning result 22690 1727204282.63075: _execute() done 22690 1727204282.63082: dumping result to json 22690 1727204282.63085: done dumping result, returning 22690 1727204282.63088: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'lsr27' [127b8e07-fff9-78bb-bf56-00000000053d] 22690 1727204282.63090: sending task result for task 127b8e07-fff9-78bb-bf56-00000000053d 22690 1727204282.63163: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000053d 22690 1727204282.63168: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22690 1727204282.63320: no more pending results, returning what we have 22690 1727204282.63324: results queue empty 22690 1727204282.63325: checking for any_errors_fatal 22690 1727204282.63337: done checking for any_errors_fatal 22690 1727204282.63338: checking for max_fail_percentage 22690 1727204282.63339: done checking for max_fail_percentage 22690 1727204282.63340: checking to see if all hosts have failed and the running result is not ok 22690 1727204282.63342: done checking to see if all hosts have failed 22690 1727204282.63343: getting the remaining hosts for this loop 22690 1727204282.63344: done getting the remaining hosts for this loop 22690 1727204282.63349: getting the next task for host managed-node2 22690 1727204282.63358: done getting next task for host managed-node2 22690 1727204282.63361: ^ task is: TASK: meta (flush_handlers) 22690 1727204282.63363: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204282.63370: getting variables 22690 1727204282.63372: in VariableManager get_vars() 22690 1727204282.63406: Calling all_inventory to load vars for managed-node2 22690 1727204282.63410: Calling groups_inventory to load vars for managed-node2 22690 1727204282.63415: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.63429: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.63433: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.63437: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.65802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.69015: done with get_vars() 22690 1727204282.69054: done getting variables 22690 1727204282.69139: in VariableManager get_vars() 22690 1727204282.69151: Calling all_inventory to load vars for managed-node2 22690 1727204282.69154: Calling groups_inventory to load vars for managed-node2 22690 1727204282.69157: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.69162: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.69168: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.69171: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.71027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.74439: done with get_vars() 22690 1727204282.74513: done queuing things up, now waiting for results queue to drain 22690 1727204282.74516: results queue empty 22690 1727204282.74517: checking for any_errors_fatal 22690 1727204282.74520: done checking for any_errors_fatal 22690 1727204282.74521: checking for max_fail_percentage 22690 1727204282.74522: done checking for max_fail_percentage 22690 1727204282.74523: checking to see if all hosts have failed and the running result is not ok 22690 1727204282.74523: done checking to see if all hosts have failed 22690 1727204282.74530: getting the remaining hosts for this loop 22690 1727204282.74531: done getting the remaining hosts for this loop 22690 1727204282.74534: getting the next task for host managed-node2 22690 1727204282.74539: done getting next task for host managed-node2 22690 1727204282.74541: ^ task is: TASK: meta (flush_handlers) 22690 1727204282.74543: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204282.74546: getting variables 22690 1727204282.74547: in VariableManager get_vars() 22690 1727204282.74557: Calling all_inventory to load vars for managed-node2 22690 1727204282.74560: Calling groups_inventory to load vars for managed-node2 22690 1727204282.74562: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.74589: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.74592: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.74596: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.76304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.79546: done with get_vars() 22690 1727204282.79577: done getting variables 22690 1727204282.79647: in VariableManager get_vars() 22690 1727204282.79658: Calling all_inventory to load vars for managed-node2 22690 1727204282.79661: Calling groups_inventory to load vars for managed-node2 22690 1727204282.79663: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.79671: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.79674: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.79677: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.91312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.93960: done with get_vars() 22690 1727204282.94073: done queuing things up, now waiting for results queue to drain 22690 1727204282.94076: results queue empty 22690 1727204282.94077: checking for any_errors_fatal 22690 1727204282.94079: done checking for any_errors_fatal 22690 1727204282.94080: checking for max_fail_percentage 22690 1727204282.94081: done checking for max_fail_percentage 22690 1727204282.94082: checking to see if all hosts have failed and the running result is not ok 22690 1727204282.94083: done checking to see if all hosts have failed 22690 1727204282.94083: getting the remaining hosts for this loop 22690 1727204282.94084: done getting the remaining hosts for this loop 22690 1727204282.94088: getting the next task for host managed-node2 22690 1727204282.94092: done getting next task for host managed-node2 22690 1727204282.94093: ^ task is: None 22690 1727204282.94095: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204282.94096: done queuing things up, now waiting for results queue to drain 22690 1727204282.94097: results queue empty 22690 1727204282.94098: checking for any_errors_fatal 22690 1727204282.94099: done checking for any_errors_fatal 22690 1727204282.94100: checking for max_fail_percentage 22690 1727204282.94101: done checking for max_fail_percentage 22690 1727204282.94102: checking to see if all hosts have failed and the running result is not ok 22690 1727204282.94103: done checking to see if all hosts have failed 22690 1727204282.94104: getting the next task for host managed-node2 22690 1727204282.94107: done getting next task for host managed-node2 22690 1727204282.94108: ^ task is: None 22690 1727204282.94109: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204282.94206: in VariableManager get_vars() 22690 1727204282.94226: done with get_vars() 22690 1727204282.94232: in VariableManager get_vars() 22690 1727204282.94242: done with get_vars() 22690 1727204282.94246: variable 'omit' from source: magic vars 22690 1727204282.94289: in VariableManager get_vars() 22690 1727204282.94329: done with get_vars() 22690 1727204282.94353: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 22690 1727204282.94642: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22690 1727204282.94668: getting the remaining hosts for this loop 22690 1727204282.94670: done getting the remaining hosts for this loop 22690 1727204282.94672: getting the next task for host managed-node2 22690 1727204282.94675: done getting next task for host managed-node2 22690 1727204282.94678: ^ task is: TASK: Gathering Facts 22690 1727204282.94679: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204282.94682: getting variables 22690 1727204282.94683: in VariableManager get_vars() 22690 1727204282.94700: Calling all_inventory to load vars for managed-node2 22690 1727204282.94702: Calling groups_inventory to load vars for managed-node2 22690 1727204282.94704: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204282.94708: Calling all_plugins_play to load vars for managed-node2 22690 1727204282.94710: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204282.94712: Calling groups_plugins_play to load vars for managed-node2 22690 1727204282.95690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204282.97103: done with get_vars() 22690 1727204282.97129: done getting variables 22690 1727204282.97169: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.367) 0:00:50.255 ***** 22690 1727204282.97188: entering _queue_task() for managed-node2/gather_facts 22690 1727204282.97498: worker is 1 (out of 1 available) 22690 1727204282.97512: exiting _queue_task() for managed-node2/gather_facts 22690 1727204282.97526: done queuing things up, now waiting for results queue to drain 22690 1727204282.97528: waiting for pending results... 
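The play that starts here is only partially visible in this log. As a rough orientation, the play header in tests_ethernet.yml around line 77 presumably resembles the sketch below; the play name and the fact-gathering step are taken from the log, while the host pattern and the empty task list are placeholders, not the file's actual contents:

    # Hedged sketch of the "Verify that cleanup restored state to default"
    # play in tests_ethernet.yml; only the name and the fact-gathering
    # behaviour are grounded in the log, everything else is an assumption.
    - name: Verify that cleanup restored state to default
      hosts: all          # assumption: the real host pattern is not shown here
      gather_facts: true  # matches the "Gathering Facts" task queued below
      tasks: []           # the play's verification tasks appear later in the run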
22690 1727204282.97752: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22690 1727204282.97838: in run() - task 127b8e07-fff9-78bb-bf56-00000000056d 22690 1727204282.97849: variable 'ansible_search_path' from source: unknown 22690 1727204282.97902: calling self._execute() 22690 1727204282.97976: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.97982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.97989: variable 'omit' from source: magic vars 22690 1727204282.98471: variable 'ansible_distribution_major_version' from source: facts 22690 1727204282.98476: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204282.98479: variable 'omit' from source: magic vars 22690 1727204282.98482: variable 'omit' from source: magic vars 22690 1727204282.98484: variable 'omit' from source: magic vars 22690 1727204282.98538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204282.98585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204282.98617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204282.98642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204282.98660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204282.98700: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22690 1727204282.98819: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.98823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.98836: Set connection var ansible_connection to ssh 22690 1727204282.98854: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204282.98869: Set connection var ansible_pipelining to False 22690 1727204282.98878: Set connection var ansible_shell_type to sh 22690 1727204282.98889: Set connection var ansible_shell_executable to /bin/sh 22690 1727204282.98902: Set connection var ansible_timeout to 10 22690 1727204282.98935: variable 'ansible_shell_executable' from source: unknown 22690 1727204282.98943: variable 'ansible_connection' from source: unknown 22690 1727204282.98951: variable 'ansible_module_compression' from source: unknown 22690 1727204282.98959: variable 'ansible_shell_type' from source: unknown 22690 1727204282.98968: variable 'ansible_shell_executable' from source: unknown 22690 1727204282.98978: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204282.98992: variable 'ansible_pipelining' from source: unknown 22690 1727204282.99001: variable 'ansible_timeout' from source: unknown 22690 1727204282.99010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204282.99229: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204282.99252: variable 'omit' from source: magic vars 22690 1727204282.99269: starting attempt loop 22690 1727204282.99359: running the 
handler 22690 1727204282.99362: variable 'ansible_facts' from source: unknown 22690 1727204282.99367: _low_level_execute_command(): starting 22690 1727204282.99371: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204283.00160: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204283.00253: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.00311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204283.00339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204283.00361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204283.00443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204283.02236: stdout chunk (state=3): >>>/root <<< 22690 1727204283.02509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204283.02515: stdout chunk (state=3): >>><<< 22690 1727204283.02518: stderr chunk (state=3): >>><<< 22690 1727204283.02698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204283.02702: _low_level_execute_command(): starting 22690 1727204283.02705: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199 `" && echo 
ansible-tmp-1727204283.02569-26013-251404829782199="` echo /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199 `" ) && sleep 0' 22690 1727204283.04502: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.04712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204283.04811: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204283.04979: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204283.06946: stdout chunk (state=3): >>>ansible-tmp-1727204283.02569-26013-251404829782199=/root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199 <<< 22690 1727204283.07204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204283.07208: stdout chunk (state=3): >>><<< 22690 1727204283.07211: stderr chunk (state=3): >>><<< 22690 1727204283.07404: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204283.02569-26013-251404829782199=/root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204283.07408: variable 'ansible_module_compression' from source: unknown 22690 1727204283.07410: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22690 1727204283.07591: variable 'ansible_facts' from source: unknown 22690 
1727204283.07967: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py 22690 1727204283.08398: Sending initial data 22690 1727204283.08402: Sent initial data (152 bytes) 22690 1727204283.09653: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204283.09658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.09863: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204283.09869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204283.09940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204283.10035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204283.11680: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204283.11776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22690 1727204283.11852: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpu2epbj93 /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py <<< 22690 1727204283.11856: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py" <<< 22690 1727204283.11927: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpu2epbj93" to remote "/root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py" <<< 22690 1727204283.13839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204283.13899: stderr chunk (state=3): >>><<< 22690 1727204283.13903: stdout chunk (state=3): >>><<< 22690 1727204283.13931: done transferring module to remote 22690 1727204283.13941: _low_level_execute_command(): starting 22690 1727204283.13946: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/ /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py && sleep 0' 22690 1727204283.14512: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204283.14517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.14519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204283.14522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.14583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204283.14589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204283.14591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204283.14660: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204283.16487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204283.16583: stderr chunk (state=3): >>><<< 22690 1727204283.16591: stdout chunk (state=3): >>><<< 22690 1727204283.16704: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204283.16708: _low_level_execute_command(): starting 22690 1727204283.16716: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/AnsiballZ_setup.py && sleep 0' 22690 1727204283.17586: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204283.17590: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration <<< 22690 1727204283.17641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.17739: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204283.17787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204283.17790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204283.17867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204283.83854: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "03", "epoch": "1727204283", "epoch_int": "1727204283", "date": "2024-09-24", "time": "14:58:03", "iso8601_micro": "2024-09-24T18:58:03.472482Z", "iso8601": "2024-09-24T18:58:03Z", "iso8601_basic": "20240924T145803472482", "iso8601_basic_short": "20240924T145803", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3049, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 667, "free": 3049}, "nocache": {"free": 3482, "used": 234}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": 
"512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 629, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316465664, "block_size": 4096, "block_total": 64479564, "block_available": 61356559, "block_used": 3123005, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.51708984375, "5m": 0.51220703125, "15m": 0.287109375}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": 
"UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22690 1727204283.86096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
<<< 22690 1727204283.86100: stdout chunk (state=3): >>><<< 22690 1727204283.86103: stderr chunk (state=3): >>><<< 22690 1727204283.86477: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-200.fc40.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 18:26:09 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d89b04801ed266210fa80047284a4", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQD4uYEQ0blFVMNECLbHg8I8bt6HNCICQsflF/UMpk0eQJSQTlNftYzqUlaTymYjywLWvlIgMPo/d7+0gWkz7meSAO3u1gPerjD7azC2rj4lDjn9Vg1hDqRXAvF48oRBmkteDXesfvJgtDWrND4QklBnAEPfXy5Sht7qaBII4184+SNWYPkchVgMfR7GRWuiHAZq4tU26+lUWBSIG3Z1JSc6J6pEdnRidQA8U+ehbsrKW8EoJouU9A6gTjp6xI3+eDFaiquKtKS9U/VDIm6IWeQqILeXHqEZdyOxAQHV8NIS1g2/d0eG52d0MdBV7ao+R6XiNgUX2bHtLjOwZF6nvUFI9PbRA3gw4W0McyfNnbVaxwAo/ykTDiz758mlsCgwnys1GpJ/v4uBa/qMPjdBYueVtI2qkUBGoZidUNsbwCtVdTRdKYD021PmBnByDiRxLxU7kgos/d2FTGN1ldphRSSXVa2OZ9BpLtOrOen0R7AomBUXMZ81zMBX6ks53q9Mrp0=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGO0u3rb9OTbm4p9jESbqD5+qtukT0zORAPdHbxncG9wMcTyPr227OZYgfQur6ZXfo7JLADaRBdG4ps3byawENw=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIPUtJu2WY63sQFdYodSkcdomCsm5asBz3nW0GoebskY/", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "03", "epoch": "1727204283", "epoch_int": "1727204283", "date": "2024-09-24", "time": "14:58:03", "iso8601_micro": "2024-09-24T18:58:03.472482Z", "iso8601": "2024-09-24T18:58:03Z", "iso8601_basic": "20240924T145803472482", "iso8601_basic_short": "20240924T145803", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3716, "ansible_memfree_mb": 3049, "ansible_swaptotal_mb": 3715, "ansible_swapfree_mb": 3715, "ansible_memory_mb": {"real": {"total": 3716, "used": 667, "free": 3049}, "nocache": {"free": 3482, "used": 234}, "swap": {"total": 3715, "free": 3715, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_uuid": "ec2d89b0-4801-ed26-6210-fa80047284a4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7610368", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["043ca97e-e9dd-43e0-af21-e3d1e20db391"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 629, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251316465664, "block_size": 4096, "block_total": 64479564, "block_available": 61356559, "block_used": 3123005, "inode_total": 16384000, "inode_available": 16301510, "inode_used": 82490, "uuid": "043ca97e-e9dd-43e0-af21-e3d1e20db391"}], "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "40", "ansible_distribution_major_version": "40", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.8.63 52596 10.31.47.73 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.8.63 52596 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.51708984375, "5m": 0.51220703125, "15m": 0.287109375}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-200.fc40.x86_64", "root": "UUID=043ca97e-e9dd-43e0-af21-e3d1e20db391", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5415c36-cd9b-4c4f-95be-3929d2c37184", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::f7:13ff:fe22:8fc1", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.47.73", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:f7:13:22:8f:c1", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.47.73"], "ansible_all_ipv6_addresses": ["fe80::f7:13ff:fe22:8fc1"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.47.73", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::f7:13ff:fe22:8fc1"]}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 22690 1727204283.87386: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204283.87624: _low_level_execute_command(): starting 22690 1727204283.87628: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204283.02569-26013-251404829782199/ > /dev/null 2>&1 && sleep 0' 22690 1727204283.88873: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204283.88877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204283.88933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204283.89071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204283.91004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204283.91079: stderr chunk (state=3): >>><<< 22690 1727204283.91088: stdout chunk (state=3): >>><<< 22690 1727204283.91118: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204283.91616: handler run complete 22690 1727204283.91619: variable 'ansible_facts' from source: unknown 22690 1727204283.91849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204283.92647: variable 'ansible_facts' from source: unknown 22690 1727204283.92951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204283.93323: attempt loop complete, returning result 22690 1727204283.93337: _execute() done 22690 1727204283.93347: dumping result to json 22690 1727204283.93384: done dumping result, returning 22690 1727204283.93410: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [127b8e07-fff9-78bb-bf56-00000000056d] 22690 1727204283.93423: sending task result for task 127b8e07-fff9-78bb-bf56-00000000056d 22690 1727204283.94301: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000056d 22690 1727204283.94305: WORKER PROCESS EXITING ok: [managed-node2] 22690 1727204283.94874: no more pending results, returning what we have 22690 1727204283.94877: results queue empty 22690 1727204283.94878: checking for any_errors_fatal 22690 1727204283.94880: done checking for any_errors_fatal 22690 1727204283.94881: checking for max_fail_percentage 22690 1727204283.94882: done checking for max_fail_percentage 22690 1727204283.94883: checking to see if all hosts have failed and the running result is not ok 22690 1727204283.94884: done checking to see if all hosts have failed 22690 1727204283.94885: getting the remaining hosts for this loop 22690 1727204283.94886: done getting the remaining hosts for this loop 22690 1727204283.94890: getting the next task for host managed-node2 22690 1727204283.94896: done getting next task for host managed-node2 22690 1727204283.94898: ^ task is: TASK: meta (flush_handlers) 22690 1727204283.94900: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204283.94904: getting variables 22690 1727204283.94905: in VariableManager get_vars() 22690 1727204283.94932: Calling all_inventory to load vars for managed-node2 22690 1727204283.94935: Calling groups_inventory to load vars for managed-node2 22690 1727204283.94939: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204283.94951: Calling all_plugins_play to load vars for managed-node2 22690 1727204283.94954: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204283.94957: Calling groups_plugins_play to load vars for managed-node2 22690 1727204283.99526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.05527: done with get_vars() 22690 1727204284.05675: done getting variables 22690 1727204284.05818: in VariableManager get_vars() 22690 1727204284.05869: Calling all_inventory to load vars for managed-node2 22690 1727204284.05873: Calling groups_inventory to load vars for managed-node2 22690 1727204284.05876: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.05882: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.05885: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.05888: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.09309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.12193: done with get_vars() 22690 1727204284.12248: done queuing things up, now waiting for results queue to drain 22690 1727204284.12255: results queue empty 22690 1727204284.12256: checking for any_errors_fatal 22690 1727204284.12266: done checking for any_errors_fatal 22690 1727204284.12267: checking for max_fail_percentage 22690 1727204284.12268: done checking for max_fail_percentage 22690 1727204284.12269: checking to see if all hosts have failed and the running result is not ok 22690 1727204284.12270: done checking to see if all hosts have failed 22690 1727204284.12278: getting the remaining hosts for this loop 22690 1727204284.12280: done getting the remaining hosts for this loop 22690 1727204284.12283: getting the next task for host managed-node2 22690 1727204284.12288: done getting next task for host managed-node2 22690 1727204284.12291: ^ task is: TASK: Verify network state restored to default 22690 1727204284.12293: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204284.12296: getting variables 22690 1727204284.12297: in VariableManager get_vars() 22690 1727204284.12310: Calling all_inventory to load vars for managed-node2 22690 1727204284.12312: Calling groups_inventory to load vars for managed-node2 22690 1727204284.12318: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.12326: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.12329: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.12332: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.14494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.17008: done with get_vars() 22690 1727204284.17044: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Tuesday 24 September 2024 14:58:04 -0400 (0:00:01.199) 0:00:51.454 ***** 22690 1727204284.17140: entering _queue_task() for managed-node2/include_tasks 22690 1727204284.17564: worker is 1 (out of 1 available) 22690 1727204284.17730: exiting _queue_task() for managed-node2/include_tasks 22690 1727204284.17744: done queuing things up, now waiting for results queue to drain 22690 1727204284.17745: waiting for pending results... 22690 1727204284.17949: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 22690 1727204284.18096: in run() - task 127b8e07-fff9-78bb-bf56-000000000078 22690 1727204284.18124: variable 'ansible_search_path' from source: unknown 22690 1727204284.18187: calling self._execute() 22690 1727204284.18438: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204284.18555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204284.18579: variable 'omit' from source: magic vars 22690 1727204284.19791: variable 'ansible_distribution_major_version' from source: facts 22690 1727204284.19867: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204284.19878: _execute() done 22690 1727204284.19888: dumping result to json 22690 1727204284.19893: done dumping result, returning 22690 1727204284.19895: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [127b8e07-fff9-78bb-bf56-000000000078] 22690 1727204284.20011: sending task result for task 127b8e07-fff9-78bb-bf56-000000000078 22690 1727204284.20344: done sending task result for task 127b8e07-fff9-78bb-bf56-000000000078 22690 1727204284.20348: WORKER PROCESS EXITING 22690 1727204284.20385: no more pending results, returning what we have 22690 1727204284.20390: in VariableManager get_vars() 22690 1727204284.20432: Calling all_inventory to load vars for managed-node2 22690 1727204284.20437: Calling groups_inventory to load vars for managed-node2 22690 1727204284.20441: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.20456: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.20466: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.20472: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.25355: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.28923: done with get_vars() 22690 1727204284.28952: 
variable 'ansible_search_path' from source: unknown 22690 1727204284.28978: we have included files to process 22690 1727204284.28979: generating all_blocks data 22690 1727204284.28981: done generating all_blocks data 22690 1727204284.28982: processing included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22690 1727204284.28983: loading included file: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22690 1727204284.28986: Loading data from /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22690 1727204284.29473: done processing included file 22690 1727204284.29475: iterating over new_blocks loaded from include file 22690 1727204284.29477: in VariableManager get_vars() 22690 1727204284.29490: done with get_vars() 22690 1727204284.29492: filtering new block on tags 22690 1727204284.29511: done filtering new block on tags 22690 1727204284.29521: done iterating over new_blocks loaded from include file included: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 22690 1727204284.29527: extending task lists for all hosts with included blocks 22690 1727204284.29564: done extending task lists 22690 1727204284.29568: done processing included files 22690 1727204284.29569: results queue empty 22690 1727204284.29569: checking for any_errors_fatal 22690 1727204284.29571: done checking for any_errors_fatal 22690 1727204284.29572: checking for max_fail_percentage 22690 1727204284.29573: done checking for max_fail_percentage 22690 1727204284.29574: checking to see if all hosts have failed and the running result is not ok 22690 1727204284.29574: done checking to see if all hosts have failed 22690 1727204284.29575: getting the remaining hosts for this loop 22690 1727204284.29576: done getting the remaining hosts for this loop 22690 1727204284.29579: getting the next task for host managed-node2 22690 1727204284.29583: done getting next task for host managed-node2 22690 1727204284.29585: ^ task is: TASK: Check routes and DNS 22690 1727204284.29587: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204284.29589: getting variables 22690 1727204284.29590: in VariableManager get_vars() 22690 1727204284.29600: Calling all_inventory to load vars for managed-node2 22690 1727204284.29603: Calling groups_inventory to load vars for managed-node2 22690 1727204284.29606: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.29612: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.29614: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.29617: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.31324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.32539: done with get_vars() 22690 1727204284.32570: done getting variables 22690 1727204284.32638: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:58:04 -0400 (0:00:00.155) 0:00:51.610 ***** 22690 1727204284.32675: entering _queue_task() for managed-node2/shell 22690 1727204284.33183: worker is 1 (out of 1 available) 22690 1727204284.33202: exiting _queue_task() for managed-node2/shell 22690 1727204284.33219: done queuing things up, now waiting for results queue to drain 22690 1727204284.33220: waiting for pending results... 22690 1727204284.34255: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 22690 1727204284.34262: in run() - task 127b8e07-fff9-78bb-bf56-00000000057e 22690 1727204284.34286: variable 'ansible_search_path' from source: unknown 22690 1727204284.34292: variable 'ansible_search_path' from source: unknown 22690 1727204284.34297: calling self._execute() 22690 1727204284.34302: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204284.34306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204284.34310: variable 'omit' from source: magic vars 22690 1727204284.34903: variable 'ansible_distribution_major_version' from source: facts 22690 1727204284.34908: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204284.34911: variable 'omit' from source: magic vars 22690 1727204284.34915: variable 'omit' from source: magic vars 22690 1727204284.34918: variable 'omit' from source: magic vars 22690 1727204284.34921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22690 1727204284.34923: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22690 1727204284.34963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22690 1727204284.35012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204284.35026: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22690 1727204284.35030: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
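Editorial note: the "Check routes and DNS" task above is gated on the facts gathered earlier in this trace; the line `Evaluated conditional (ansible_distribution_major_version != '6'): True` is resolved from the cached `ansible_facts` for managed-node2. A minimal sketch of that lookup is shown below, assuming the setup module's JSON reply (printed in full earlier in this section) has been saved to a hypothetical local file `facts.json`; it only illustrates which keys the trace relies on, not how Ansible's VariableManager actually templates conditionals.

```python
# Minimal sketch: read the setup module's JSON reply (as printed in the trace
# above) and resolve the facts the play consults for managed-node2.
# "facts.json" is a hypothetical local copy of that reply, not an Ansible file.
import json

with open("facts.json") as fh:
    reply = json.load(fh)

facts = reply["ansible_facts"]

# The conditional logged as "(ansible_distribution_major_version != '6')".
run_task = facts["ansible_distribution_major_version"] != "6"   # True for Fedora 40

# Keys the network checks care about, all present in the trace above.
default_ipv4 = facts["ansible_default_ipv4"]["address"]         # "10.31.47.73"
nameservers = facts["ansible_dns"]["nameservers"]               # ["127.0.0.53"]
distro = f'{facts["ansible_distribution"]} {facts["ansible_distribution_version"]}'

print(run_task, default_ipv4, nameservers, distro)
```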
22690 1727204284.35032: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204284.35035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204284.35374: Set connection var ansible_connection to ssh 22690 1727204284.35378: Set connection var ansible_module_compression to ZIP_DEFLATED 22690 1727204284.35382: Set connection var ansible_pipelining to False 22690 1727204284.35384: Set connection var ansible_shell_type to sh 22690 1727204284.35387: Set connection var ansible_shell_executable to /bin/sh 22690 1727204284.35390: Set connection var ansible_timeout to 10 22690 1727204284.35397: variable 'ansible_shell_executable' from source: unknown 22690 1727204284.35403: variable 'ansible_connection' from source: unknown 22690 1727204284.35409: variable 'ansible_module_compression' from source: unknown 22690 1727204284.35412: variable 'ansible_shell_type' from source: unknown 22690 1727204284.35418: variable 'ansible_shell_executable' from source: unknown 22690 1727204284.35423: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204284.35426: variable 'ansible_pipelining' from source: unknown 22690 1727204284.35429: variable 'ansible_timeout' from source: unknown 22690 1727204284.35432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204284.35593: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204284.35599: variable 'omit' from source: magic vars 22690 1727204284.35602: starting attempt loop 22690 1727204284.35605: running the handler 22690 1727204284.35608: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 22690 1727204284.35611: _low_level_execute_command(): starting 22690 1727204284.35615: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22690 1727204284.36476: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204284.36505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204284.36516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.36557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 
1727204284.36561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204284.36585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204284.36664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204284.38438: stdout chunk (state=3): >>>/root <<< 22690 1727204284.38588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204284.38799: stderr chunk (state=3): >>><<< 22690 1727204284.38803: stdout chunk (state=3): >>><<< 22690 1727204284.39010: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204284.39016: _low_level_execute_command(): starting 22690 1727204284.39023: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564 `" && echo ansible-tmp-1727204284.3889325-26060-134737878347564="` echo /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564 `" ) && sleep 0' 22690 1727204284.40809: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204284.40830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204284.40898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204284.41154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204284.41285: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204284.41389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204284.43401: stdout chunk (state=3): >>>ansible-tmp-1727204284.3889325-26060-134737878347564=/root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564 <<< 22690 1727204284.43694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204284.43720: stderr chunk (state=3): >>><<< 22690 1727204284.43724: stdout chunk (state=3): >>><<< 22690 1727204284.43812: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204284.3889325-26060-134737878347564=/root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564 , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204284.43876: variable 'ansible_module_compression' from source: unknown 22690 1727204284.44094: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22690l01f6ep2/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22690 1727204284.44148: variable 'ansible_facts' from source: unknown 22690 1727204284.44377: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py 22690 1727204284.44794: Sending initial data 22690 1727204284.44799: Sent initial data (156 bytes) 22690 1727204284.45582: stderr chunk (state=3): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.45693: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.45697: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204284.45699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204284.45768: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204284.45934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204284.47537: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 22690 1727204284.47613: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22690 1727204284.47650: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22690 1727204284.47756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpzaqjf57s /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py <<< 22690 1727204284.47760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py" <<< 22690 1727204284.47971: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-22690l01f6ep2/tmpzaqjf57s" to remote "/root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py" <<< 22690 1727204284.50993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204284.51040: stderr chunk (state=3): >>><<< 22690 1727204284.51044: stdout chunk (state=3): >>><<< 22690 1727204284.51084: done transferring module to remote 22690 1727204284.51097: _low_level_execute_command(): starting 22690 1727204284.51100: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/ /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py && sleep 0' 22690 1727204284.51775: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204284.51784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204284.51797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204284.51833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204284.51837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204284.51840: stderr chunk (state=3): >>>debug2: match not found <<< 22690 1727204284.51851: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.51854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22690 1727204284.51862: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.47.73 is address <<< 22690 1727204284.51992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22690 1727204284.51997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22690 1727204284.52000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22690 1727204284.52002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22690 1727204284.52004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 <<< 22690 1727204284.52007: stderr chunk (state=3): >>>debug2: match found <<< 22690 1727204284.52009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.52011: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204284.52016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204284.52167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204284.52237: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204284.54250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204284.54254: stdout chunk (state=3): >>><<< 22690 1727204284.54257: stderr chunk (state=3): >>><<< 22690 1727204284.54307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204284.54316: _low_level_execute_command(): starting 22690 1727204284.54319: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/AnsiballZ_command.py && sleep 0' 22690 1727204284.55295: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204284.55302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.55376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204284.55417: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204284.55421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204284.55692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204284.73134: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3230sec preferred_lft 3230sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:04.720496", "end": "2024-09-24 14:58:04.729715", "delta": "0:00:00.009219", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22690 1727204284.74831: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. <<< 22690 1727204284.75045: stderr chunk (state=3): >>><<< 22690 1727204284.75050: stdout chunk (state=3): >>><<< 22690 1727204284.75216: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3230sec preferred_lft 3230sec\n inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:04.720496", "end": "2024-09-24 14:58:04.729715", "delta": "0:00:00.009219", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.47.73 closed. 
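The module result above records the exact script the task executed: it prints the interface configuration (ip a), the IPv4 and IPv6 routing tables (ip route, ip -6 route), and /etc/resolv.conf, each behind a marker line, and aborts on the first failure thanks to set -euo pipefail. The task file at check_network_dns.yml:6 is not included in this log, so the sketch below is a hedged reconstruction: the task name, the conditional, and the command body come from the log, while the module spelling and layout are assumptions.

# Hedged reconstruction of the "Check routes and DNS" task; the command
# body is copied verbatim from the module invocation logged above.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi
  when: ansible_distribution_major_version != '6'   # the conditional evaluated to True earlier in this task's trace
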
22690 1727204284.75324: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22690 1727204284.75328: _low_level_execute_command(): starting 22690 1727204284.75330: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204284.3889325-26060-134737878347564/ > /dev/null 2>&1 && sleep 0' 22690 1727204284.76574: stderr chunk (state=2): >>>OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 <<< 22690 1727204284.76856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22690 1727204284.76934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' <<< 22690 1727204284.76983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22690 1727204284.77000: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22690 1727204284.77181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22690 1727204284.79130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22690 1727204284.79187: stderr chunk (state=3): >>><<< 22690 1727204284.79192: stdout chunk (state=3): >>><<< 22690 1727204284.79209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.6p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.47.73 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.47.73 originally 10.31.47.73 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/7ef5e35320' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22690 1727204284.79219: handler run complete 22690 1727204284.79240: Evaluated conditional (False): False 22690 1727204284.79249: attempt loop complete, returning result 22690 1727204284.79252: _execute() done 22690 1727204284.79255: dumping result to json 22690 1727204284.79261: done dumping result, returning 22690 1727204284.79270: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [127b8e07-fff9-78bb-bf56-00000000057e] 22690 1727204284.79275: sending task result for task 127b8e07-fff9-78bb-bf56-00000000057e 22690 1727204284.79397: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000057e 22690 1727204284.79400: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009219", "end": "2024-09-24 14:58:04.729715", "rc": 0, "start": "2024-09-24 14:58:04.720496" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:f7:13:22:8f:c1 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.47.73/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3230sec preferred_lft 3230sec inet6 fe80::f7:13ff:fe22:8fc1/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.47.73 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.47.73 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 22690 1727204284.79481: no more pending results, returning what we have 22690 1727204284.79486: results queue empty 22690 1727204284.79487: checking for any_errors_fatal 22690 1727204284.79489: done checking for any_errors_fatal 22690 1727204284.79490: checking for max_fail_percentage 22690 1727204284.79492: done checking for max_fail_percentage 22690 1727204284.79493: checking to see if all hosts have failed and the running result is not ok 22690 1727204284.79494: done checking to see if all hosts have failed 22690 1727204284.79495: getting the remaining hosts for this loop 22690 1727204284.79496: done getting the remaining hosts for this loop 22690 1727204284.79500: getting the next task for host managed-node2 22690 1727204284.79506: done getting next task for host managed-node2 22690 1727204284.79510: ^ task is: TASK: Verify DNS and network connectivity 22690 1727204284.79515: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22690 1727204284.79526: getting variables 22690 1727204284.79528: in VariableManager get_vars() 22690 1727204284.79557: Calling all_inventory to load vars for managed-node2 22690 1727204284.79560: Calling groups_inventory to load vars for managed-node2 22690 1727204284.79563: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.79622: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.79625: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.79630: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.82116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.83320: done with get_vars() 22690 1727204284.83346: done getting variables 22690 1727204284.83403: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:58:04 -0400 (0:00:00.507) 0:00:52.117 ***** 22690 1727204284.83429: entering _queue_task() for managed-node2/shell 22690 1727204284.83727: worker is 1 (out of 1 available) 22690 1727204284.83744: exiting _queue_task() for managed-node2/shell 22690 1727204284.83757: done queuing things up, now waiting for results queue to drain 22690 1727204284.83759: waiting for pending results... 
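Note the difference between the raw module result earlier in this section ("changed": true) and the final task status displayed just above ("changed": false). That is the usual pattern for read-only diagnostics: the shell module reports changed by default, and the task overrides the status, almost certainly via changed_when: false (the task file is not shown, so this is an inference, not a quote). A minimal sketch of the pattern, with an abbreviated command standing in for the full script:

- name: Check routes and DNS
  ansible.builtin.shell: ip a   # abbreviated stand-in for the diagnostic script shown above
  changed_when: false           # report ok rather than changed for a read-only check
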
22690 1727204284.84006: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 22690 1727204284.84067: in run() - task 127b8e07-fff9-78bb-bf56-00000000057f 22690 1727204284.84071: variable 'ansible_search_path' from source: unknown 22690 1727204284.84075: variable 'ansible_search_path' from source: unknown 22690 1727204284.84111: calling self._execute() 22690 1727204284.84224: variable 'ansible_host' from source: host vars for 'managed-node2' 22690 1727204284.84279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22690 1727204284.84283: variable 'omit' from source: magic vars 22690 1727204284.84715: variable 'ansible_distribution_major_version' from source: facts 22690 1727204284.84735: Evaluated conditional (ansible_distribution_major_version != '6'): True 22690 1727204284.84897: variable 'ansible_facts' from source: unknown 22690 1727204284.85950: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 22690 1727204284.86022: when evaluation is False, skipping this task 22690 1727204284.86026: _execute() done 22690 1727204284.86029: dumping result to json 22690 1727204284.86032: done dumping result, returning 22690 1727204284.86035: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [127b8e07-fff9-78bb-bf56-00000000057f] 22690 1727204284.86038: sending task result for task 127b8e07-fff9-78bb-bf56-00000000057f skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 22690 1727204284.86191: no more pending results, returning what we have 22690 1727204284.86196: results queue empty 22690 1727204284.86197: checking for any_errors_fatal 22690 1727204284.86211: done checking for any_errors_fatal 22690 1727204284.86212: checking for max_fail_percentage 22690 1727204284.86217: done checking for max_fail_percentage 22690 1727204284.86218: checking to see if all hosts have failed and the running result is not ok 22690 1727204284.86219: done checking to see if all hosts have failed 22690 1727204284.86220: getting the remaining hosts for this loop 22690 1727204284.86222: done getting the remaining hosts for this loop 22690 1727204284.86227: getting the next task for host managed-node2 22690 1727204284.86237: done getting next task for host managed-node2 22690 1727204284.86240: ^ task is: TASK: meta (flush_handlers) 22690 1727204284.86242: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204284.86246: getting variables 22690 1727204284.86248: in VariableManager get_vars() 22690 1727204284.86286: Calling all_inventory to load vars for managed-node2 22690 1727204284.86290: Calling groups_inventory to load vars for managed-node2 22690 1727204284.86294: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.86312: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.86319: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.86323: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.87201: done sending task result for task 127b8e07-fff9-78bb-bf56-00000000057f 22690 1727204284.87206: WORKER PROCESS EXITING 22690 1727204284.88471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.90817: done with get_vars() 22690 1727204284.90850: done getting variables 22690 1727204284.90931: in VariableManager get_vars() 22690 1727204284.90944: Calling all_inventory to load vars for managed-node2 22690 1727204284.90946: Calling groups_inventory to load vars for managed-node2 22690 1727204284.90950: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.90956: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.90958: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.90961: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.92500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.94715: done with get_vars() 22690 1727204284.94759: done queuing things up, now waiting for results queue to drain 22690 1727204284.94762: results queue empty 22690 1727204284.94763: checking for any_errors_fatal 22690 1727204284.94768: done checking for any_errors_fatal 22690 1727204284.94769: checking for max_fail_percentage 22690 1727204284.94770: done checking for max_fail_percentage 22690 1727204284.94771: checking to see if all hosts have failed and the running result is not ok 22690 1727204284.94772: done checking to see if all hosts have failed 22690 1727204284.94773: getting the remaining hosts for this loop 22690 1727204284.94774: done getting the remaining hosts for this loop 22690 1727204284.94777: getting the next task for host managed-node2 22690 1727204284.94782: done getting next task for host managed-node2 22690 1727204284.94783: ^ task is: TASK: meta (flush_handlers) 22690 1727204284.94785: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204284.94789: getting variables 22690 1727204284.94790: in VariableManager get_vars() 22690 1727204284.94802: Calling all_inventory to load vars for managed-node2 22690 1727204284.94804: Calling groups_inventory to load vars for managed-node2 22690 1727204284.94807: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.94817: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.94819: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.94823: Calling groups_plugins_play to load vars for managed-node2 22690 1727204284.96428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204284.98594: done with get_vars() 22690 1727204284.98633: done getting variables 22690 1727204284.98696: in VariableManager get_vars() 22690 1727204284.98708: Calling all_inventory to load vars for managed-node2 22690 1727204284.98710: Calling groups_inventory to load vars for managed-node2 22690 1727204284.98716: Calling all_plugins_inventory to load vars for managed-node2 22690 1727204284.98721: Calling all_plugins_play to load vars for managed-node2 22690 1727204284.98724: Calling groups_plugins_inventory to load vars for managed-node2 22690 1727204284.98727: Calling groups_plugins_play to load vars for managed-node2 22690 1727204285.00248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22690 1727204285.02436: done with get_vars() 22690 1727204285.02467: done queuing things up, now waiting for results queue to drain 22690 1727204285.02469: results queue empty 22690 1727204285.02470: checking for any_errors_fatal 22690 1727204285.02471: done checking for any_errors_fatal 22690 1727204285.02471: checking for max_fail_percentage 22690 1727204285.02472: done checking for max_fail_percentage 22690 1727204285.02473: checking to see if all hosts have failed and the running result is not ok 22690 1727204285.02473: done checking to see if all hosts have failed 22690 1727204285.02474: getting the remaining hosts for this loop 22690 1727204285.02475: done getting the remaining hosts for this loop 22690 1727204285.02482: getting the next task for host managed-node2 22690 1727204285.02484: done getting next task for host managed-node2 22690 1727204285.02485: ^ task is: None 22690 1727204285.02486: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22690 1727204285.02487: done queuing things up, now waiting for results queue to drain 22690 1727204285.02487: results queue empty 22690 1727204285.02488: checking for any_errors_fatal 22690 1727204285.02488: done checking for any_errors_fatal 22690 1727204285.02489: checking for max_fail_percentage 22690 1727204285.02489: done checking for max_fail_percentage 22690 1727204285.02490: checking to see if all hosts have failed and the running result is not ok 22690 1727204285.02490: done checking to see if all hosts have failed 22690 1727204285.02491: getting the next task for host managed-node2 22690 1727204285.02493: done getting next task for host managed-node2 22690 1727204285.02493: ^ task is: None 22690 1727204285.02494: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False PLAY RECAP ********************************************************************* managed-node2 : ok=82 changed=3 unreachable=0 failed=0 skipped=74 rescued=0 ignored=1 Tuesday 24 September 2024 14:58:05 -0400 (0:00:00.191) 0:00:52.309 ***** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.81s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.63s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.49s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 1.97s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6 Install iproute --------------------------------------------------------- 1.60s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Check which packages are installed --- 1.55s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.27s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Create veth interface lsr27 --------------------------------------------- 1.25s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Gathering Facts --------------------------------------------------------- 1.20s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Gathering Facts --------------------------------------------------------- 1.19s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 fedora.linux_system_roles.network : Check which packages are installed --- 1.19s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.14s 
/tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Gathering Facts --------------------------------------------------------- 1.10s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Gathering Facts --------------------------------------------------------- 1.10s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 fedora.linux_system_roles.network : Check which packages are installed --- 1.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.06s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Gathering Facts --------------------------------------------------------- 1.05s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Check if system is ostree ----------------------------------------------- 0.98s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Gathering Facts --------------------------------------------------------- 0.97s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.94s /tmp/collections-MVC/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 22690 1727204285.02597: RUNNING CLEANUP
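
Earlier in this section the "Verify DNS and network connectivity" task was skipped because its condition, ansible_facts["distribution"] == "CentOS", evaluated to False on this host; the recap counts it among skipped=74. Only the task name, the condition, and the skip are visible in the log, so the sketch below is illustrative: the when guard matches the logged false_condition, while the command body is a placeholder and not taken from the log. The per-task timing table above is the kind of summary a timing callback (for example ansible.posix.profile_tasks) prints after the recap; enabling such a callback is an Ansible configuration setting and is likewise not shown here.

# Hedged sketch of a task gated the way "Verify DNS and network
# connectivity" is gated in this run; the shell body is a placeholder.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: |
    set -euo pipefail
    getent hosts mirrors.fedoraproject.org   # placeholder connectivity check, not from the log
  changed_when: false
  when: ansible_facts["distribution"] == "CentOS"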